Feb 28 09:03:27 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 28 09:03:27 crc restorecon[4556]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 28 09:03:27 crc restorecon[4556]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc 
restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 09:03:27 crc 
restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 
09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 09:03:27 crc 
restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 28 09:03:27 crc 
restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 
crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 
09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 09:03:27 crc 
restorecon[4556]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc 
restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc 
restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 09:03:27 crc restorecon[4556]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 28 09:03:27 crc restorecon[4556]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 
crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc 
restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc 
restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc 
restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc 
restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 09:03:28 crc 
restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 09:03:28 crc restorecon[4556]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 28 09:03:28 crc restorecon[4556]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 28 09:03:28 crc restorecon[4556]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 28 09:03:28 crc kubenswrapper[4687]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 28 09:03:28 crc kubenswrapper[4687]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 28 09:03:28 crc kubenswrapper[4687]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 28 09:03:28 crc kubenswrapper[4687]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 28 09:03:28 crc kubenswrapper[4687]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 28 09:03:28 crc kubenswrapper[4687]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.507504 4687 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511697 4687 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511725 4687 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511730 4687 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511738 4687 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511745 4687 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511751 4687 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511755 4687 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511761 4687 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511770 4687 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 28 09:03:28 crc kubenswrapper[4687]: 
W0228 09:03:28.511775 4687 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511780 4687 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511787 4687 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511793 4687 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511799 4687 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511805 4687 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511810 4687 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511815 4687 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511819 4687 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511822 4687 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511826 4687 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511829 4687 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511833 4687 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511837 4687 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511840 4687 
feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511843 4687 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511847 4687 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511850 4687 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511853 4687 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511857 4687 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511861 4687 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511864 4687 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511868 4687 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511872 4687 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511877 4687 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511882 4687 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511886 4687 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511890 4687 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511894 4687 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511897 4687 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511901 4687 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511905 4687 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511909 4687 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511914 4687 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511918 4687 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511923 4687 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511926 4687 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511931 4687 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511935 4687 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511940 4687 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511944 4687 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511948 4687 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511952 4687 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511956 4687 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511960 4687 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511964 4687 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511967 4687 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511973 4687 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511977 4687 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511980 4687 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511984 4687 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511987 4687 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511990 4687 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511994 4687 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.511997 4687 feature_gate.go:330] unrecognized feature gate: Example Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.512001 4687 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.512004 4687 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.512007 4687 feature_gate.go:330] unrecognized feature gate: 
NutanixMultiSubnets Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.512011 4687 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.512014 4687 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.512031 4687 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.512034 4687 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.512878 4687 flags.go:64] FLAG: --address="0.0.0.0" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.512893 4687 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.512901 4687 flags.go:64] FLAG: --anonymous-auth="true" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.512907 4687 flags.go:64] FLAG: --application-metrics-count-limit="100" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.512913 4687 flags.go:64] FLAG: --authentication-token-webhook="false" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.512917 4687 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.512922 4687 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.512928 4687 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.512933 4687 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.512939 4687 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.512944 4687 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 28 09:03:28 crc kubenswrapper[4687]: 
I0228 09:03:28.512949 4687 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.512953 4687 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.512957 4687 flags.go:64] FLAG: --cgroup-root="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.512961 4687 flags.go:64] FLAG: --cgroups-per-qos="true" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.512967 4687 flags.go:64] FLAG: --client-ca-file="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.512972 4687 flags.go:64] FLAG: --cloud-config="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.512975 4687 flags.go:64] FLAG: --cloud-provider="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.512979 4687 flags.go:64] FLAG: --cluster-dns="[]" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.512985 4687 flags.go:64] FLAG: --cluster-domain="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.512989 4687 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.512993 4687 flags.go:64] FLAG: --config-dir="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.512998 4687 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513003 4687 flags.go:64] FLAG: --container-log-max-files="5" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513008 4687 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513012 4687 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513032 4687 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513037 4687 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 
09:03:28.513041 4687 flags.go:64] FLAG: --contention-profiling="false" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513045 4687 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513049 4687 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513053 4687 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513057 4687 flags.go:64] FLAG: --cpu-manager-policy-options="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513063 4687 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513066 4687 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513070 4687 flags.go:64] FLAG: --enable-debugging-handlers="true" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513074 4687 flags.go:64] FLAG: --enable-load-reader="false" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513077 4687 flags.go:64] FLAG: --enable-server="true" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513081 4687 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513087 4687 flags.go:64] FLAG: --event-burst="100" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513091 4687 flags.go:64] FLAG: --event-qps="50" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513095 4687 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513099 4687 flags.go:64] FLAG: --event-storage-event-limit="default=0" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513102 4687 flags.go:64] FLAG: --eviction-hard="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513108 4687 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 
09:03:28.513112 4687 flags.go:64] FLAG: --eviction-minimum-reclaim="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513116 4687 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513122 4687 flags.go:64] FLAG: --eviction-soft="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513126 4687 flags.go:64] FLAG: --eviction-soft-grace-period="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513131 4687 flags.go:64] FLAG: --exit-on-lock-contention="false" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513134 4687 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513138 4687 flags.go:64] FLAG: --experimental-mounter-path="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513142 4687 flags.go:64] FLAG: --fail-cgroupv1="false" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513146 4687 flags.go:64] FLAG: --fail-swap-on="true" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513150 4687 flags.go:64] FLAG: --feature-gates="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513155 4687 flags.go:64] FLAG: --file-check-frequency="20s" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513158 4687 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513162 4687 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513166 4687 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513170 4687 flags.go:64] FLAG: --healthz-port="10248" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513174 4687 flags.go:64] FLAG: --help="false" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513178 4687 flags.go:64] FLAG: --hostname-override="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 
09:03:28.513181 4687 flags.go:64] FLAG: --housekeeping-interval="10s" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513185 4687 flags.go:64] FLAG: --http-check-frequency="20s" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513189 4687 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513193 4687 flags.go:64] FLAG: --image-credential-provider-config="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513196 4687 flags.go:64] FLAG: --image-gc-high-threshold="85" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513200 4687 flags.go:64] FLAG: --image-gc-low-threshold="80" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513204 4687 flags.go:64] FLAG: --image-service-endpoint="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513208 4687 flags.go:64] FLAG: --kernel-memcg-notification="false" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513212 4687 flags.go:64] FLAG: --kube-api-burst="100" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513216 4687 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513220 4687 flags.go:64] FLAG: --kube-api-qps="50" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513224 4687 flags.go:64] FLAG: --kube-reserved="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513228 4687 flags.go:64] FLAG: --kube-reserved-cgroup="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513232 4687 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513237 4687 flags.go:64] FLAG: --kubelet-cgroups="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513241 4687 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513246 4687 flags.go:64] FLAG: --lock-file="" Feb 28 09:03:28 crc 
kubenswrapper[4687]: I0228 09:03:28.513252 4687 flags.go:64] FLAG: --log-cadvisor-usage="false" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513268 4687 flags.go:64] FLAG: --log-flush-frequency="5s" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513274 4687 flags.go:64] FLAG: --log-json-info-buffer-size="0" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513281 4687 flags.go:64] FLAG: --log-json-split-stream="false" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513286 4687 flags.go:64] FLAG: --log-text-info-buffer-size="0" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513291 4687 flags.go:64] FLAG: --log-text-split-stream="false" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513296 4687 flags.go:64] FLAG: --logging-format="text" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513300 4687 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513306 4687 flags.go:64] FLAG: --make-iptables-util-chains="true" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513310 4687 flags.go:64] FLAG: --manifest-url="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513314 4687 flags.go:64] FLAG: --manifest-url-header="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513320 4687 flags.go:64] FLAG: --max-housekeeping-interval="15s" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513324 4687 flags.go:64] FLAG: --max-open-files="1000000" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513331 4687 flags.go:64] FLAG: --max-pods="110" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513338 4687 flags.go:64] FLAG: --maximum-dead-containers="-1" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513342 4687 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513347 4687 flags.go:64] FLAG: --memory-manager-policy="None" Feb 28 
09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513351 4687 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513355 4687 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513359 4687 flags.go:64] FLAG: --node-ip="192.168.126.11" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513364 4687 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513375 4687 flags.go:64] FLAG: --node-status-max-images="50" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513380 4687 flags.go:64] FLAG: --node-status-update-frequency="10s" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513384 4687 flags.go:64] FLAG: --oom-score-adj="-999" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513388 4687 flags.go:64] FLAG: --pod-cidr="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513392 4687 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513400 4687 flags.go:64] FLAG: --pod-manifest-path="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513404 4687 flags.go:64] FLAG: --pod-max-pids="-1" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513408 4687 flags.go:64] FLAG: --pods-per-core="0" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513412 4687 flags.go:64] FLAG: --port="10250" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513416 4687 flags.go:64] FLAG: --protect-kernel-defaults="false" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513420 4687 flags.go:64] FLAG: --provider-id="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513425 4687 flags.go:64] FLAG: 
--qos-reserved="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513428 4687 flags.go:64] FLAG: --read-only-port="10255" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513433 4687 flags.go:64] FLAG: --register-node="true" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513437 4687 flags.go:64] FLAG: --register-schedulable="true" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513440 4687 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513448 4687 flags.go:64] FLAG: --registry-burst="10" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513458 4687 flags.go:64] FLAG: --registry-qps="5" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513462 4687 flags.go:64] FLAG: --reserved-cpus="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513467 4687 flags.go:64] FLAG: --reserved-memory="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513472 4687 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513476 4687 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513480 4687 flags.go:64] FLAG: --rotate-certificates="false" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513485 4687 flags.go:64] FLAG: --rotate-server-certificates="false" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513488 4687 flags.go:64] FLAG: --runonce="false" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513493 4687 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513498 4687 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513502 4687 flags.go:64] FLAG: --seccomp-default="false" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513506 4687 flags.go:64] FLAG: 
--serialize-image-pulls="true" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513510 4687 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513514 4687 flags.go:64] FLAG: --storage-driver-db="cadvisor" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513518 4687 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513523 4687 flags.go:64] FLAG: --storage-driver-password="root" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513527 4687 flags.go:64] FLAG: --storage-driver-secure="false" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513531 4687 flags.go:64] FLAG: --storage-driver-table="stats" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513535 4687 flags.go:64] FLAG: --storage-driver-user="root" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513539 4687 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513544 4687 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513548 4687 flags.go:64] FLAG: --system-cgroups="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513552 4687 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513558 4687 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513562 4687 flags.go:64] FLAG: --tls-cert-file="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513566 4687 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513573 4687 flags.go:64] FLAG: --tls-min-version="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513577 4687 flags.go:64] FLAG: --tls-private-key-file="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513581 4687 
flags.go:64] FLAG: --topology-manager-policy="none" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513585 4687 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513589 4687 flags.go:64] FLAG: --topology-manager-scope="container" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513592 4687 flags.go:64] FLAG: --v="2" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513599 4687 flags.go:64] FLAG: --version="false" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513627 4687 flags.go:64] FLAG: --vmodule="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513637 4687 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.513641 4687 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513739 4687 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513745 4687 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513750 4687 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513754 4687 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513760 4687 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513764 4687 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513769 4687 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513772 4687 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 28 09:03:28 crc 
kubenswrapper[4687]: W0228 09:03:28.513776 4687 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513780 4687 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513783 4687 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513786 4687 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513790 4687 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513794 4687 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513797 4687 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513801 4687 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513804 4687 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513807 4687 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513811 4687 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513814 4687 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513817 4687 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513821 4687 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513824 4687 feature_gate.go:330] unrecognized 
feature gate: AdminNetworkPolicy Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513828 4687 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513831 4687 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513835 4687 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513838 4687 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513841 4687 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513845 4687 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513848 4687 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513852 4687 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513855 4687 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513858 4687 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513862 4687 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513865 4687 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513869 4687 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513876 4687 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513880 4687 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513883 4687 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513887 4687 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513891 4687 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513895 4687 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513899 4687 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513904 4687 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513908 4687 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513912 4687 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513916 4687 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513920 4687 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513923 4687 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513927 4687 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513930 4687 feature_gate.go:330] unrecognized feature gate: Example Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513933 4687 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513937 4687 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513940 4687 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513943 4687 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513947 4687 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513951 4687 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513955 4687 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513959 4687 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513962 4687 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513966 4687 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513969 4687 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513972 4687 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513976 4687 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513979 4687 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513983 4687 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513988 4687 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.513992 4687 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.514004 4687 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.514008 4687 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.514012 4687 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.514329 4687 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.521726 4687 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.521758 4687 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.521979 4687 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.521992 4687 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522005 4687 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522011 4687 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522036 4687 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522040 4687 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522046 4687 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522050 4687 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522058 4687 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522061 4687 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522075 4687 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522080 4687 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522086 4687 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522091 4687 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522097 4687 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522100 4687 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522105 4687 feature_gate.go:330] unrecognized feature 
gate: MultiArchInstallAWS Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522110 4687 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522114 4687 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522117 4687 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522123 4687 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522127 4687 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522136 4687 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522140 4687 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522144 4687 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522150 4687 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522158 4687 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522162 4687 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522166 4687 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522170 4687 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522174 4687 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522178 4687 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522182 4687 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522188 4687 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522193 4687 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522200 4687 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522206 4687 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522211 4687 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522215 4687 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522219 4687 feature_gate.go:330] unrecognized feature gate: Example Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522222 4687 
feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522227 4687 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522231 4687 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522235 4687 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522242 4687 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522246 4687 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522252 4687 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522269 4687 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522273 4687 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522277 4687 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522281 4687 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522285 4687 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522292 4687 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522296 4687 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522300 4687 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522304 4687 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522311 4687 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522315 4687 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522319 4687 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522323 4687 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522330 4687 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522334 4687 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522338 4687 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522342 4687 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522348 4687 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522354 4687 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522359 4687 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522363 4687 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522367 4687 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522371 4687 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.522375 4687 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.522382 4687 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523287 4687 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523639 4687 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523663 4687 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523668 4687 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523674 4687 
feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523678 4687 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523682 4687 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523686 4687 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523690 4687 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523694 4687 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523697 4687 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523701 4687 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523704 4687 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523707 4687 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523710 4687 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523715 4687 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523718 4687 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523721 4687 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523725 4687 feature_gate.go:330] unrecognized 
feature gate: MachineAPIMigration Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523730 4687 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523733 4687 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523740 4687 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523744 4687 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523747 4687 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523751 4687 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523754 4687 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523758 4687 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523762 4687 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523766 4687 feature_gate.go:330] unrecognized feature gate: Example Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523770 4687 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523774 4687 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523778 4687 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523781 4687 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 
09:03:28.523785 4687 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523792 4687 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523795 4687 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523801 4687 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523804 4687 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523808 4687 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523812 4687 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523815 4687 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523821 4687 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523830 4687 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523837 4687 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523842 4687 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523846 4687 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523850 4687 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523854 4687 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523857 4687 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523861 4687 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523868 4687 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523871 4687 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523875 4687 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523881 4687 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523885 4687 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523890 4687 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523894 4687 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523898 4687 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523901 4687 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523905 4687 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523909 4687 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523913 4687 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523917 4687 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523922 4687 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523926 4687 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523931 4687 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523936 4687 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523942 4687 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523946 4687 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523949 4687 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.523953 4687 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.523961 4687 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.524247 4687 server.go:940] "Client rotation is on, will bootstrap in background" Feb 28 09:03:28 crc kubenswrapper[4687]: E0228 09:03:28.527357 4687 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.530517 4687 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.531194 4687 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.532306 4687 server.go:997] "Starting client certificate rotation" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.532333 4687 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.532474 4687 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.545939 4687 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 28 09:03:28 crc kubenswrapper[4687]: E0228 09:03:28.548284 4687 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.25.194:6443: connect: connection refused" logger="UnhandledError" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.548694 4687 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.559420 4687 log.go:25] "Validated CRI v1 runtime API" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.578402 4687 log.go:25] "Validated CRI v1 image API" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.579612 4687 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.583515 4687 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-28-09-00-14-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.583541 4687 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:49 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm:{mountpoint:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm major:0 minor:42 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:50 fsType:tmpfs blockSize:0} overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/94b752e0a51c0134b00ddef6dc7a933a9d7c1d9bdc88a18dae4192a0d557d623/merged major:0 minor:43 fsType:overlay blockSize:0}] Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.599399 4687 manager.go:217] Machine: {Timestamp:2026-02-28 09:03:28.597848575 +0000 UTC m=+0.288417932 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2445406 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:5b9fb325-94af-4056-b5ce-29e2eb30cdd4 BootID:76119540-8bc2-4cd3-a111-0e11e6360590 Filesystems:[{Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:49 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 
DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm DeviceMajor:0 DeviceMinor:42 Capacity:65536000 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:50 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:05:d9:75 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:enp3s0 MacAddress:fa:16:3e:05:d9:75 Speed:-1 Mtu:1500} {Name:enp7s0 MacAddress:fa:16:3e:f4:34:17 Speed:-1 Mtu:1440} {Name:enp7s0.20 MacAddress:52:54:00:de:29:87 Speed:-1 Mtu:1436} {Name:enp7s0.21 MacAddress:52:54:00:68:c3:42 Speed:-1 Mtu:1436} {Name:enp7s0.22 MacAddress:52:54:00:ed:90:85 Speed:-1 Mtu:1436} {Name:eth10 MacAddress:2a:53:85:0e:8e:11 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:02:e4:70:df:62:3e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:65536 Type:Data Level:1} {Id:0 Size:65536 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:65536 Type:Data Level:1} {Id:1 Size:65536 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: 
DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:65536 Type:Data Level:1} {Id:10 Size:65536 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:65536 Type:Data Level:1} {Id:11 Size:65536 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:65536 Type:Data Level:1} {Id:2 Size:65536 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:65536 Type:Data Level:1} {Id:3 Size:65536 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:65536 Type:Data Level:1} {Id:4 Size:65536 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:65536 Type:Data Level:1} {Id:5 Size:65536 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:65536 Type:Data Level:1} {Id:6 Size:65536 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:65536 Type:Data Level:1} {Id:7 Size:65536 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:65536 Type:Data Level:1} {Id:8 Size:65536 Type:Instruction Level:1} {Id:8 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:65536 Type:Data Level:1} {Id:9 Size:65536 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.599603 4687 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.599739 4687 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.600660 4687 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.600850 4687 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.600893 4687 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.601128 4687 topology_manager.go:138] "Creating topology manager with none policy" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.601141 4687 container_manager_linux.go:303] "Creating device plugin manager" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.601437 4687 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.601474 4687 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.602048 4687 state_mem.go:36] "Initialized new in-memory state store" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.602151 4687 server.go:1245] "Using root directory" path="/var/lib/kubelet" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.603658 4687 kubelet.go:418] "Attempting to sync node with API server" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.603685 4687 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.603705 4687 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.603719 4687 kubelet.go:324] "Adding apiserver pod source" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.603732 4687 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.605875 4687 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.606450 4687 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.606795 4687 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.25.194:6443: connect: connection refused Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.606826 4687 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.25.194:6443: connect: connection refused Feb 28 09:03:28 crc kubenswrapper[4687]: E0228 09:03:28.606893 4687 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.25.194:6443: connect: connection refused" logger="UnhandledError" Feb 28 09:03:28 crc kubenswrapper[4687]: E0228 09:03:28.606899 4687 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.25.194:6443: connect: connection refused" logger="UnhandledError" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.607239 4687 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.608139 4687 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.608164 4687 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 
09:03:28.608172 4687 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.608181 4687 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.608195 4687 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.608201 4687 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.608208 4687 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.608219 4687 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.608227 4687 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.608234 4687 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.608244 4687 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.608252 4687 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.608878 4687 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.609307 4687 server.go:1280] "Started kubelet" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.609723 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.25.194:6443: connect: connection refused Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.609955 4687 
server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.609948 4687 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 28 09:03:28 crc systemd[1]: Started Kubernetes Kubelet. Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.611189 4687 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.612343 4687 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.612379 4687 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.612543 4687 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 28 09:03:28 crc kubenswrapper[4687]: E0228 09:03:28.612555 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.612599 4687 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.612586 4687 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 28 09:03:28 crc kubenswrapper[4687]: E0228 09:03:28.613011 4687 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.194:6443: connect: connection refused" interval="200ms" Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.613327 4687 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.25.194:6443: connect: connection refused Feb 
28 09:03:28 crc kubenswrapper[4687]: E0228 09:03:28.613435 4687 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.25.194:6443: connect: connection refused" logger="UnhandledError" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.614369 4687 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.614393 4687 factory.go:55] Registering systemd factory Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.614405 4687 factory.go:221] Registration of the systemd container factory successfully Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.614969 4687 factory.go:153] Registering CRI-O factory Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.615032 4687 factory.go:221] Registration of the crio container factory successfully Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.615048 4687 server.go:460] "Adding debug handlers to kubelet server" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.615060 4687 factory.go:103] Registering Raw factory Feb 28 09:03:28 crc kubenswrapper[4687]: E0228 09:03:28.614484 4687 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 192.168.25.194:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18985da322729fea default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.609279978 +0000 UTC m=+0.299849315,LastTimestamp:2026-02-28 09:03:28.609279978 +0000 UTC m=+0.299849315,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.615103 4687 manager.go:1196] Started watching for new ooms in manager Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.619485 4687 manager.go:319] Starting recovery of all containers Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627310 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627377 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627389 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627415 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627426 4687 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627437 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627447 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627457 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627469 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627481 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627492 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627501 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627511 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627526 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627535 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627548 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627557 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" 
seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627566 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627576 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627587 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627598 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627608 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627618 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 
09:03:28.627629 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627642 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627654 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627668 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627678 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627689 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627697 4687 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627708 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627722 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627748 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627775 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627785 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627797 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627807 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627816 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627827 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627840 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627850 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627861 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627873 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627885 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627897 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627909 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627921 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627933 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627945 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627956 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627968 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.627979 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628036 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628051 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" 
seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628064 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628079 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628089 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628100 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628110 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628120 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628130 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628142 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628152 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628162 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628172 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628181 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628190 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628214 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628223 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628233 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628242 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628251 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628267 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628280 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628288 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628298 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628307 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628316 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628325 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628337 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628346 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628356 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628366 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628381 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628390 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628399 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628408 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628418 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628427 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.628438 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.629927 4687 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.629975 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.629999 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630039 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630065 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630079 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630092 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630150 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630162 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630174 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630184 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630197 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630209 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630221 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630236 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630270 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630286 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630302 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630318 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630330 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630347 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630360 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630372 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630385 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630396 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630406 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630417 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630428 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630438 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630449 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630460 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630470 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630480 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630490 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630501 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630512 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630521 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630535 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630545 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630554 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630565 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630575 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630589 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630599 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630610 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630617 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630627 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630638 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630650 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630661 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630670 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630679 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630689 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630699 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630710 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630719 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630728 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630736 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630746 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630757 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630769 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630780 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630789 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630799 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630809 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630819 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630829 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630839 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630849 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630860 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630874 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630884 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630894 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630906 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630915 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630926 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630937 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630951 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630964 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630975 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.630985 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631001 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631012 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631062 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631073 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631085 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631095 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631285 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631295 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631306 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631316 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631328 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631339 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d"
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631349 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631360 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631370 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631381 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631392 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631401 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631411 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631420 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631431 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631440 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631450 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631459 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631469 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631480 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631489 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631499 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631511 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631520 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" 
seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631529 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631541 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631551 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631561 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631573 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631585 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 
09:03:28.631594 4687 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631603 4687 reconstruct.go:97] "Volume reconstruction finished" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.631610 4687 reconciler.go:26] "Reconciler: start to sync state" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.638640 4687 manager.go:324] Recovery completed Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.647600 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.648740 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.648778 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.648789 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.649485 4687 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.649504 4687 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.649525 4687 state_mem.go:36] "Initialized new in-memory state store" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.653469 4687 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.653869 4687 policy_none.go:49] "None policy: Start" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.655447 4687 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.655477 4687 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.655485 4687 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.655512 4687 kubelet.go:2335] "Starting kubelet main sync loop" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.655513 4687 state_mem.go:35] "Initializing new in-memory state store" Feb 28 09:03:28 crc kubenswrapper[4687]: E0228 09:03:28.655556 4687 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 28 09:03:28 crc kubenswrapper[4687]: W0228 09:03:28.656669 4687 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.25.194:6443: connect: connection refused Feb 28 09:03:28 crc kubenswrapper[4687]: E0228 09:03:28.656715 4687 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.25.194:6443: connect: connection refused" logger="UnhandledError" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.696823 4687 manager.go:334] "Starting Device Plugin manager" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.696867 4687 manager.go:513] "Failed to read data from checkpoint" 
checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.696882 4687 server.go:79] "Starting device plugin registration server" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.697231 4687 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.697251 4687 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.697418 4687 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.697539 4687 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.697554 4687 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 28 09:03:28 crc kubenswrapper[4687]: E0228 09:03:28.703740 4687 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.756254 4687 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.756472 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.758057 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.758119 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.758134 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.758392 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.759350 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.759417 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.759436 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.759464 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.759477 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.759688 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.760599 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.760718 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.760843 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.760777 4687 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.761057 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.761082 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.761089 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.761191 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.761495 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.761739 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.761793 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.762971 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.762981 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.763003 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.763015 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.763005 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.763073 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.762983 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.763365 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.763451 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.763665 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:28 crc kubenswrapper[4687]: 
I0228 09:03:28.763905 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.763939 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.764720 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.764857 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.764966 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.764911 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.765096 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.765110 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.765374 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.765467 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.766361 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.766389 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.766400 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.797684 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.798443 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.798526 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.798600 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.798687 4687 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 09:03:28 crc kubenswrapper[4687]: E0228 09:03:28.799324 4687 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.194:6443: connect: connection refused" node="crc" Feb 28 09:03:28 crc kubenswrapper[4687]: E0228 09:03:28.813648 4687 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.194:6443: connect: connection refused" interval="400ms" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.833438 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.833469 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.833488 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.833510 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.833529 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod 
\"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.833545 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.833560 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.833575 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.833592 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.833607 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.833625 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.833648 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.833663 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.833680 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.833697 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.934397 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.934445 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.934470 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.934491 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.934510 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.934525 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.934539 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.934545 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.934606 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.934554 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.934615 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 
09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.934648 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.934650 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.934728 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.934802 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.934777 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.934845 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: 
\"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.934880 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.934894 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.934933 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.934952 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.935000 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.935002 4687 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.935052 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.935067 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.935078 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.935091 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.935114 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.935133 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 09:03:28 crc kubenswrapper[4687]: I0228 09:03:28.935177 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:28.999683 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.001320 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.001354 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.001366 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.001403 4687 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 09:03:29 crc kubenswrapper[4687]: E0228 09:03:29.001779 4687 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.194:6443: connect: connection refused" node="crc" Feb 28 
09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.090592 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.109988 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.125465 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.141624 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.145922 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 09:03:29 crc kubenswrapper[4687]: W0228 09:03:29.150282 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-2bd83a920fbbd2aea98a75e521621a523a983d47dcf22450d750f0aef61edc3f WatchSource:0}: Error finding container 2bd83a920fbbd2aea98a75e521621a523a983d47dcf22450d750f0aef61edc3f: Status 404 returned error can't find the container with id 2bd83a920fbbd2aea98a75e521621a523a983d47dcf22450d750f0aef61edc3f Feb 28 09:03:29 crc kubenswrapper[4687]: W0228 09:03:29.151572 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-b4e652596828ab1df4644db90a8348f46583118f9eb6977658c015809fdfc6eb WatchSource:0}: Error finding container b4e652596828ab1df4644db90a8348f46583118f9eb6977658c015809fdfc6eb: Status 404 returned error can't find the container with id 
b4e652596828ab1df4644db90a8348f46583118f9eb6977658c015809fdfc6eb Feb 28 09:03:29 crc kubenswrapper[4687]: W0228 09:03:29.158678 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-a24ec143bf92dd8c6992bcedb9ad144c05eb60a50bebd8bc017da49415310669 WatchSource:0}: Error finding container a24ec143bf92dd8c6992bcedb9ad144c05eb60a50bebd8bc017da49415310669: Status 404 returned error can't find the container with id a24ec143bf92dd8c6992bcedb9ad144c05eb60a50bebd8bc017da49415310669 Feb 28 09:03:29 crc kubenswrapper[4687]: W0228 09:03:29.160276 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-3e0c727857396e2831c5dd66b85659532ec3304d90948f0f5173892a0a091df4 WatchSource:0}: Error finding container 3e0c727857396e2831c5dd66b85659532ec3304d90948f0f5173892a0a091df4: Status 404 returned error can't find the container with id 3e0c727857396e2831c5dd66b85659532ec3304d90948f0f5173892a0a091df4 Feb 28 09:03:29 crc kubenswrapper[4687]: E0228 09:03:29.215306 4687 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.194:6443: connect: connection refused" interval="800ms" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.402669 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.404136 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.404179 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:29 crc 
kubenswrapper[4687]: I0228 09:03:29.404195 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.404233 4687 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 09:03:29 crc kubenswrapper[4687]: E0228 09:03:29.404757 4687 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.194:6443: connect: connection refused" node="crc" Feb 28 09:03:29 crc kubenswrapper[4687]: W0228 09:03:29.515663 4687 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.25.194:6443: connect: connection refused Feb 28 09:03:29 crc kubenswrapper[4687]: E0228 09:03:29.516054 4687 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.25.194:6443: connect: connection refused" logger="UnhandledError" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.611273 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.25.194:6443: connect: connection refused Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.660887 4687 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="36d443679c1ba64b7888f897b33fe824b1b6f91f96a513b5b29517d87984d9d4" exitCode=0 Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.660935 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"36d443679c1ba64b7888f897b33fe824b1b6f91f96a513b5b29517d87984d9d4"} Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.661110 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2bd83a920fbbd2aea98a75e521621a523a983d47dcf22450d750f0aef61edc3f"} Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.661283 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.662549 4687 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="558e8f5432b4fa8f24de4aafb76374321304781830d392843fa7ee4b54910a0d" exitCode=0 Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.662625 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"558e8f5432b4fa8f24de4aafb76374321304781830d392843fa7ee4b54910a0d"} Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.662666 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"b4e652596828ab1df4644db90a8348f46583118f9eb6977658c015809fdfc6eb"} Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.662811 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.662816 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.662943 4687 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.662957 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.663865 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.663922 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.663942 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.664255 4687 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="cf7b5ca3a918d62e8013b0ea088979ca447d141a5f323d959549cfbcd4bea4b7" exitCode=0 Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.664338 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"cf7b5ca3a918d62e8013b0ea088979ca447d141a5f323d959549cfbcd4bea4b7"} Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.664397 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3e0c727857396e2831c5dd66b85659532ec3304d90948f0f5173892a0a091df4"} Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.664491 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.665507 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 
09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.665525 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.665535 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.667330 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c78e20c2e56b7da1c8015367caf37186de5ea7675b3dcf696233ed14753e0d2f"} Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.667383 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a24ec143bf92dd8c6992bcedb9ad144c05eb60a50bebd8bc017da49415310669"} Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.671073 4687 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0aa898df0d68b69e98c254ee6873db17db552db751ed7c2905aa5036dc86badb" exitCode=0 Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.671128 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0aa898df0d68b69e98c254ee6873db17db552db751ed7c2905aa5036dc86badb"} Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.671156 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a7934c8fd0cf4b3563f158dda89159002ac70094b5ce8ff78e7df660e91c2835"} Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.671272 4687 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.671995 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.672038 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.672051 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.674298 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.675387 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.675434 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:29 crc kubenswrapper[4687]: I0228 09:03:29.675446 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:29 crc kubenswrapper[4687]: W0228 09:03:29.816377 4687 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.25.194:6443: connect: connection refused Feb 28 09:03:29 crc kubenswrapper[4687]: E0228 09:03:29.816445 4687 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.25.194:6443: connect: connection refused" logger="UnhandledError" Feb 28 09:03:29 crc 
kubenswrapper[4687]: W0228 09:03:29.886823 4687 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.25.194:6443: connect: connection refused Feb 28 09:03:29 crc kubenswrapper[4687]: E0228 09:03:29.886895 4687 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.25.194:6443: connect: connection refused" logger="UnhandledError" Feb 28 09:03:30 crc kubenswrapper[4687]: E0228 09:03:30.016849 4687 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.194:6443: connect: connection refused" interval="1.6s" Feb 28 09:03:30 crc kubenswrapper[4687]: W0228 09:03:30.045834 4687 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.25.194:6443: connect: connection refused Feb 28 09:03:30 crc kubenswrapper[4687]: E0228 09:03:30.045940 4687 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.25.194:6443: connect: connection refused" logger="UnhandledError" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.205712 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.207404 
4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.207465 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.207477 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.207516 4687 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 09:03:30 crc kubenswrapper[4687]: E0228 09:03:30.208213 4687 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.194:6443: connect: connection refused" node="crc" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.550304 4687 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.675872 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6e8ce0c1f2ba580f7ee12ed7f684d405258d60e7694aa4c9d0f38696927f7cea"} Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.675966 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ecb53432284056d5229aeb973aa1a0c4c756f1eaee17fb89a5e1c808b2d87cf4"} Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.675982 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ee108e6a0c597bdb84871f4b8891d4d75c345c074c49781e4092b7da5ca74ca8"} 
Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.676158 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.677249 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.677318 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.677331 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.678300 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"88b461d02be747da1875053afe9b8e3bce910d348adb362acfe62a34fa85368e"} Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.678371 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8cfdcb7136a403e9c3630656973f166953327dc26baea9cd687b9346ff11d0ba"} Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.678383 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ba1dee1f7d9aa539f838c85e7f096a30ec398ee5229989757a4e4f5fd9ec9072"} Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.678327 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.679438 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.679482 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.679497 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.681610 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6d5454edaf68b80b08a3f1a56af9eed32e47a8032ccd21cc5b456c1766c5cab3"} Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.681652 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b7c0ec64d57de434662d5c2bf49e0665706fa98f05029eec8887a252174a9568"} Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.681669 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"10ac076b0cc2345aec565565d566ee0f441ff79c4dc84fc36481e148f5ada685"} Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.681680 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b07f07f4633ccdd6561d83bf0f20f0800cf1b938d203767f0d27c5acb3fd5aa1"} Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.681692 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8c587833f88ced400be887fea9b2db8c115b30eded6c6bf9db0d5f1fc87c6c37"} Feb 28 09:03:30 crc kubenswrapper[4687]: 
I0228 09:03:30.681760 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.682441 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.682479 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.682490 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.683446 4687 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="de40dd4de62a92a9e70bb08053090059d93c6a8caeaec1281bed498558946c80" exitCode=0 Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.683481 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"de40dd4de62a92a9e70bb08053090059d93c6a8caeaec1281bed498558946c80"} Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.683593 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.684353 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.684386 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.684397 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.685139 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"4f88d857dd044cbf0ff4ce263d9471877f623480bc36df1f359e72a938fc5526"} Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.685212 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.685895 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.685925 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.685936 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.776347 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 09:03:30 crc kubenswrapper[4687]: I0228 09:03:30.925718 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:03:31 crc kubenswrapper[4687]: I0228 09:03:31.690000 4687 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="5517ddfcb4cf227b8ff3b7da73728bf1eb3af61b6efec132e7efbba73b8c6500" exitCode=0 Feb 28 09:03:31 crc kubenswrapper[4687]: I0228 09:03:31.690081 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"5517ddfcb4cf227b8ff3b7da73728bf1eb3af61b6efec132e7efbba73b8c6500"} Feb 28 09:03:31 crc kubenswrapper[4687]: I0228 09:03:31.690174 4687 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 28 09:03:31 crc 
kubenswrapper[4687]: I0228 09:03:31.690253 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:31 crc kubenswrapper[4687]: I0228 09:03:31.690253 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:31 crc kubenswrapper[4687]: I0228 09:03:31.690410 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:31 crc kubenswrapper[4687]: I0228 09:03:31.691515 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:31 crc kubenswrapper[4687]: I0228 09:03:31.691558 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:31 crc kubenswrapper[4687]: I0228 09:03:31.691572 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:31 crc kubenswrapper[4687]: I0228 09:03:31.691721 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:31 crc kubenswrapper[4687]: I0228 09:03:31.691764 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:31 crc kubenswrapper[4687]: I0228 09:03:31.691777 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:31 crc kubenswrapper[4687]: I0228 09:03:31.692438 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:31 crc kubenswrapper[4687]: I0228 09:03:31.692478 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:31 crc kubenswrapper[4687]: I0228 09:03:31.692490 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 28 09:03:31 crc kubenswrapper[4687]: I0228 09:03:31.809222 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:31 crc kubenswrapper[4687]: I0228 09:03:31.810135 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:31 crc kubenswrapper[4687]: I0228 09:03:31.810167 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:31 crc kubenswrapper[4687]: I0228 09:03:31.810180 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:31 crc kubenswrapper[4687]: I0228 09:03:31.810203 4687 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 09:03:32 crc kubenswrapper[4687]: I0228 09:03:32.695442 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:32 crc kubenswrapper[4687]: I0228 09:03:32.696011 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"aa3aea6917f9e5c3e6e15755e2a10aff8dc2e15d4349fa0b9701629492e3a93b"} Feb 28 09:03:32 crc kubenswrapper[4687]: I0228 09:03:32.696072 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2c15bee19978b8a8e1338cfd61b13bdc7078a9782646ec5b9cf4da7aebdf4ffe"} Feb 28 09:03:32 crc kubenswrapper[4687]: I0228 09:03:32.696085 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"387ea2739ecabe3f8fb9959f7752dd572aa92581d44edc2b6c8ad0acf27f908a"} Feb 28 09:03:32 crc kubenswrapper[4687]: I0228 09:03:32.696095 4687 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e800eec7818c8442a50bdf7148c8ec89c582584d4967abf057548624df16c70a"} Feb 28 09:03:32 crc kubenswrapper[4687]: I0228 09:03:32.696104 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"296d4c65c975b872adb66853a55ae4072168a67dcd453e21e2894dfe69533104"} Feb 28 09:03:32 crc kubenswrapper[4687]: I0228 09:03:32.696180 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:32 crc kubenswrapper[4687]: I0228 09:03:32.696218 4687 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 28 09:03:32 crc kubenswrapper[4687]: I0228 09:03:32.696263 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:32 crc kubenswrapper[4687]: I0228 09:03:32.696739 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:32 crc kubenswrapper[4687]: I0228 09:03:32.696787 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:32 crc kubenswrapper[4687]: I0228 09:03:32.696798 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:32 crc kubenswrapper[4687]: I0228 09:03:32.696892 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:32 crc kubenswrapper[4687]: I0228 09:03:32.696920 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:32 crc kubenswrapper[4687]: I0228 09:03:32.696931 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 28 09:03:32 crc kubenswrapper[4687]: I0228 09:03:32.697007 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:32 crc kubenswrapper[4687]: I0228 09:03:32.697051 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:32 crc kubenswrapper[4687]: I0228 09:03:32.697060 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:32 crc kubenswrapper[4687]: I0228 09:03:32.825963 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:03:33 crc kubenswrapper[4687]: I0228 09:03:33.697371 4687 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 28 09:03:33 crc kubenswrapper[4687]: I0228 09:03:33.697441 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:33 crc kubenswrapper[4687]: I0228 09:03:33.698577 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:33 crc kubenswrapper[4687]: I0228 09:03:33.698621 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:33 crc kubenswrapper[4687]: I0228 09:03:33.698633 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:35 crc kubenswrapper[4687]: I0228 09:03:35.716743 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:03:35 crc kubenswrapper[4687]: I0228 09:03:35.716881 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:35 crc kubenswrapper[4687]: I0228 09:03:35.717759 4687 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:35 crc kubenswrapper[4687]: I0228 09:03:35.717792 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:35 crc kubenswrapper[4687]: I0228 09:03:35.717801 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:36 crc kubenswrapper[4687]: I0228 09:03:36.614806 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 09:03:36 crc kubenswrapper[4687]: I0228 09:03:36.615053 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:36 crc kubenswrapper[4687]: I0228 09:03:36.616546 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:36 crc kubenswrapper[4687]: I0228 09:03:36.616582 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:36 crc kubenswrapper[4687]: I0228 09:03:36.616593 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:36 crc kubenswrapper[4687]: I0228 09:03:36.963217 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 09:03:36 crc kubenswrapper[4687]: I0228 09:03:36.963402 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:36 crc kubenswrapper[4687]: I0228 09:03:36.964664 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:36 crc kubenswrapper[4687]: I0228 09:03:36.964701 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 28 09:03:36 crc kubenswrapper[4687]: I0228 09:03:36.964713 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:36 crc kubenswrapper[4687]: I0228 09:03:36.967879 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 09:03:37 crc kubenswrapper[4687]: I0228 09:03:37.249596 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 28 09:03:37 crc kubenswrapper[4687]: I0228 09:03:37.249839 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:37 crc kubenswrapper[4687]: I0228 09:03:37.251038 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:37 crc kubenswrapper[4687]: I0228 09:03:37.251085 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:37 crc kubenswrapper[4687]: I0228 09:03:37.251096 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:37 crc kubenswrapper[4687]: I0228 09:03:37.706501 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:37 crc kubenswrapper[4687]: I0228 09:03:37.706668 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 09:03:37 crc kubenswrapper[4687]: I0228 09:03:37.707310 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:37 crc kubenswrapper[4687]: I0228 09:03:37.707358 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:37 crc kubenswrapper[4687]: 
I0228 09:03:37.707369 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:38 crc kubenswrapper[4687]: I0228 09:03:38.448250 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 28 09:03:38 crc kubenswrapper[4687]: I0228 09:03:38.448546 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:38 crc kubenswrapper[4687]: I0228 09:03:38.449929 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:38 crc kubenswrapper[4687]: I0228 09:03:38.449991 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:38 crc kubenswrapper[4687]: I0228 09:03:38.450004 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:38 crc kubenswrapper[4687]: E0228 09:03:38.703903 4687 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 28 09:03:38 crc kubenswrapper[4687]: I0228 09:03:38.708601 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:38 crc kubenswrapper[4687]: I0228 09:03:38.709437 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:38 crc kubenswrapper[4687]: I0228 09:03:38.709466 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:38 crc kubenswrapper[4687]: I0228 09:03:38.709476 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:39 crc kubenswrapper[4687]: I0228 09:03:39.661403 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 09:03:39 crc kubenswrapper[4687]: I0228 09:03:39.713412 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:39 crc kubenswrapper[4687]: I0228 09:03:39.714592 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:39 crc kubenswrapper[4687]: I0228 09:03:39.714664 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:39 crc kubenswrapper[4687]: I0228 09:03:39.714679 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:39 crc kubenswrapper[4687]: I0228 09:03:39.718754 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 09:03:40 crc kubenswrapper[4687]: E0228 09:03:40.553316 4687 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 28 09:03:40 crc kubenswrapper[4687]: E0228 09:03:40.587134 4687 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.18985da322729fea default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.609279978 +0000 UTC 
m=+0.299849315,LastTimestamp:2026-02-28 09:03:28.609279978 +0000 UTC m=+0.299849315,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:40 crc kubenswrapper[4687]: I0228 09:03:40.611688 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 28 09:03:40 crc kubenswrapper[4687]: I0228 09:03:40.715544 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:40 crc kubenswrapper[4687]: I0228 09:03:40.716446 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:40 crc kubenswrapper[4687]: I0228 09:03:40.716486 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:40 crc kubenswrapper[4687]: I0228 09:03:40.716500 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:41 crc kubenswrapper[4687]: W0228 09:03:41.199653 4687 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 28 09:03:41 crc kubenswrapper[4687]: I0228 09:03:41.199758 4687 trace.go:236] Trace[1498377450]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (28-Feb-2026 09:03:31.197) (total time: 10002ms): Feb 28 09:03:41 crc kubenswrapper[4687]: Trace[1498377450]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (09:03:41.199) Feb 28 
09:03:41 crc kubenswrapper[4687]: Trace[1498377450]: [10.002054034s] [10.002054034s] END Feb 28 09:03:41 crc kubenswrapper[4687]: E0228 09:03:41.199781 4687 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 28 09:03:41 crc kubenswrapper[4687]: W0228 09:03:41.228715 4687 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:03:41Z is after 2026-02-23T05:33:13Z Feb 28 09:03:41 crc kubenswrapper[4687]: E0228 09:03:41.228759 4687 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:03:41Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 28 09:03:41 crc kubenswrapper[4687]: W0228 09:03:41.230345 4687 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:03:41Z is after 2026-02-23T05:33:13Z Feb 28 09:03:41 crc kubenswrapper[4687]: E0228 09:03:41.230400 4687 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to 
list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:03:41Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 28 09:03:41 crc kubenswrapper[4687]: W0228 09:03:41.232986 4687 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:03:41Z is after 2026-02-23T05:33:13Z Feb 28 09:03:41 crc kubenswrapper[4687]: E0228 09:03:41.233106 4687 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:03:41Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 28 09:03:41 crc kubenswrapper[4687]: I0228 09:03:41.233432 4687 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 28 09:03:41 crc kubenswrapper[4687]: I0228 09:03:41.233489 4687 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 28 
09:03:41 crc kubenswrapper[4687]: E0228 09:03:41.233955 4687 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:03:41Z is after 2026-02-23T05:33:13Z" interval="3.2s" Feb 28 09:03:41 crc kubenswrapper[4687]: E0228 09:03:41.236314 4687 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:03:41Z is after 2026-02-23T05:33:13Z" node="crc" Feb 28 09:03:41 crc kubenswrapper[4687]: I0228 09:03:41.242565 4687 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 28 09:03:41 crc kubenswrapper[4687]: I0228 09:03:41.242637 4687 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 28 09:03:41 crc kubenswrapper[4687]: I0228 09:03:41.612243 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:03:41Z is after 2026-02-23T05:33:13Z Feb 28 09:03:41 crc kubenswrapper[4687]: I0228 
09:03:41.720955 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 28 09:03:41 crc kubenswrapper[4687]: I0228 09:03:41.723519 4687 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6d5454edaf68b80b08a3f1a56af9eed32e47a8032ccd21cc5b456c1766c5cab3" exitCode=255 Feb 28 09:03:41 crc kubenswrapper[4687]: I0228 09:03:41.723578 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6d5454edaf68b80b08a3f1a56af9eed32e47a8032ccd21cc5b456c1766c5cab3"} Feb 28 09:03:41 crc kubenswrapper[4687]: I0228 09:03:41.723934 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:41 crc kubenswrapper[4687]: I0228 09:03:41.724717 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:41 crc kubenswrapper[4687]: I0228 09:03:41.724755 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:41 crc kubenswrapper[4687]: I0228 09:03:41.724766 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:41 crc kubenswrapper[4687]: I0228 09:03:41.725257 4687 scope.go:117] "RemoveContainer" containerID="6d5454edaf68b80b08a3f1a56af9eed32e47a8032ccd21cc5b456c1766c5cab3" Feb 28 09:03:42 crc kubenswrapper[4687]: I0228 09:03:42.612645 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:03:42Z is after 
2026-02-23T05:33:13Z Feb 28 09:03:42 crc kubenswrapper[4687]: I0228 09:03:42.661411 4687 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 28 09:03:42 crc kubenswrapper[4687]: I0228 09:03:42.661743 4687 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 28 09:03:42 crc kubenswrapper[4687]: I0228 09:03:42.728133 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 28 09:03:42 crc kubenswrapper[4687]: I0228 09:03:42.728645 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 28 09:03:42 crc kubenswrapper[4687]: I0228 09:03:42.730775 4687 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4ef7a656d507e18fb707996716f5d714d93c502e80b0677a494814ba269f79a1" exitCode=255 Feb 28 09:03:42 crc kubenswrapper[4687]: I0228 09:03:42.730829 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4ef7a656d507e18fb707996716f5d714d93c502e80b0677a494814ba269f79a1"} Feb 28 09:03:42 crc kubenswrapper[4687]: I0228 
09:03:42.730914 4687 scope.go:117] "RemoveContainer" containerID="6d5454edaf68b80b08a3f1a56af9eed32e47a8032ccd21cc5b456c1766c5cab3" Feb 28 09:03:42 crc kubenswrapper[4687]: I0228 09:03:42.731009 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:42 crc kubenswrapper[4687]: I0228 09:03:42.731826 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:42 crc kubenswrapper[4687]: I0228 09:03:42.731865 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:42 crc kubenswrapper[4687]: I0228 09:03:42.731878 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:42 crc kubenswrapper[4687]: I0228 09:03:42.732341 4687 scope.go:117] "RemoveContainer" containerID="4ef7a656d507e18fb707996716f5d714d93c502e80b0677a494814ba269f79a1" Feb 28 09:03:42 crc kubenswrapper[4687]: E0228 09:03:42.732537 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 09:03:42 crc kubenswrapper[4687]: I0228 09:03:42.831048 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:03:43 crc kubenswrapper[4687]: I0228 09:03:43.612767 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-28T09:03:43Z is after 2026-02-23T05:33:13Z Feb 28 09:03:43 crc kubenswrapper[4687]: I0228 09:03:43.734523 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 28 09:03:43 crc kubenswrapper[4687]: I0228 09:03:43.736936 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:43 crc kubenswrapper[4687]: I0228 09:03:43.737984 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:43 crc kubenswrapper[4687]: I0228 09:03:43.738042 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:43 crc kubenswrapper[4687]: I0228 09:03:43.738055 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:43 crc kubenswrapper[4687]: I0228 09:03:43.738556 4687 scope.go:117] "RemoveContainer" containerID="4ef7a656d507e18fb707996716f5d714d93c502e80b0677a494814ba269f79a1" Feb 28 09:03:43 crc kubenswrapper[4687]: E0228 09:03:43.738745 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 09:03:43 crc kubenswrapper[4687]: I0228 09:03:43.740889 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:03:44 crc kubenswrapper[4687]: I0228 09:03:44.436420 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:44 crc 
kubenswrapper[4687]: E0228 09:03:44.436823 4687 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:03:44Z is after 2026-02-23T05:33:13Z" interval="6.4s" Feb 28 09:03:44 crc kubenswrapper[4687]: I0228 09:03:44.437736 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:44 crc kubenswrapper[4687]: I0228 09:03:44.437776 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:44 crc kubenswrapper[4687]: I0228 09:03:44.437789 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:44 crc kubenswrapper[4687]: I0228 09:03:44.437819 4687 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 09:03:44 crc kubenswrapper[4687]: E0228 09:03:44.440070 4687 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:03:44Z is after 2026-02-23T05:33:13Z" node="crc" Feb 28 09:03:44 crc kubenswrapper[4687]: I0228 09:03:44.612333 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:03:44Z is after 2026-02-23T05:33:13Z Feb 28 09:03:44 crc kubenswrapper[4687]: I0228 09:03:44.739999 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:44 crc 
kubenswrapper[4687]: I0228 09:03:44.741120 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:44 crc kubenswrapper[4687]: I0228 09:03:44.741157 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:44 crc kubenswrapper[4687]: I0228 09:03:44.741169 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:44 crc kubenswrapper[4687]: I0228 09:03:44.741809 4687 scope.go:117] "RemoveContainer" containerID="4ef7a656d507e18fb707996716f5d714d93c502e80b0677a494814ba269f79a1" Feb 28 09:03:44 crc kubenswrapper[4687]: E0228 09:03:44.741991 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 09:03:44 crc kubenswrapper[4687]: W0228 09:03:44.862479 4687 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:03:44Z is after 2026-02-23T05:33:13Z Feb 28 09:03:44 crc kubenswrapper[4687]: E0228 09:03:44.862564 4687 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-28T09:03:44Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 28 09:03:44 crc kubenswrapper[4687]: I0228 09:03:44.943302 4687 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 28 09:03:44 crc kubenswrapper[4687]: E0228 09:03:44.946298 4687 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:03:44Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 28 09:03:45 crc kubenswrapper[4687]: W0228 09:03:45.146291 4687 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:03:45Z is after 2026-02-23T05:33:13Z Feb 28 09:03:45 crc kubenswrapper[4687]: E0228 09:03:45.146387 4687 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:03:45Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 28 09:03:45 crc kubenswrapper[4687]: W0228 09:03:45.265750 4687 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T09:03:45Z is after 2026-02-23T05:33:13Z Feb 28 09:03:45 crc kubenswrapper[4687]: E0228 09:03:45.265812 4687 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:03:45Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 28 09:03:45 crc kubenswrapper[4687]: I0228 09:03:45.384000 4687 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:03:45 crc kubenswrapper[4687]: I0228 09:03:45.612937 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:03:45Z is after 2026-02-23T05:33:13Z Feb 28 09:03:45 crc kubenswrapper[4687]: I0228 09:03:45.717851 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:03:45 crc kubenswrapper[4687]: I0228 09:03:45.741875 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:45 crc kubenswrapper[4687]: I0228 09:03:45.742845 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:45 crc kubenswrapper[4687]: I0228 09:03:45.742878 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:45 crc kubenswrapper[4687]: I0228 09:03:45.742889 4687 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:45 crc kubenswrapper[4687]: I0228 09:03:45.743541 4687 scope.go:117] "RemoveContainer" containerID="4ef7a656d507e18fb707996716f5d714d93c502e80b0677a494814ba269f79a1" Feb 28 09:03:45 crc kubenswrapper[4687]: E0228 09:03:45.743726 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 09:03:46 crc kubenswrapper[4687]: I0228 09:03:46.613045 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:03:46Z is after 2026-02-23T05:33:13Z Feb 28 09:03:46 crc kubenswrapper[4687]: I0228 09:03:46.743719 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:46 crc kubenswrapper[4687]: I0228 09:03:46.744549 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:46 crc kubenswrapper[4687]: I0228 09:03:46.744586 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:46 crc kubenswrapper[4687]: I0228 09:03:46.744596 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:46 crc kubenswrapper[4687]: I0228 09:03:46.745079 4687 scope.go:117] "RemoveContainer" containerID="4ef7a656d507e18fb707996716f5d714d93c502e80b0677a494814ba269f79a1" Feb 28 09:03:46 crc 
kubenswrapper[4687]: E0228 09:03:46.745251 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 09:03:47 crc kubenswrapper[4687]: W0228 09:03:47.041054 4687 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:03:47Z is after 2026-02-23T05:33:13Z Feb 28 09:03:47 crc kubenswrapper[4687]: E0228 09:03:47.041159 4687 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:03:47Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 28 09:03:47 crc kubenswrapper[4687]: I0228 09:03:47.612837 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:03:47Z is after 2026-02-23T05:33:13Z Feb 28 09:03:48 crc kubenswrapper[4687]: I0228 09:03:48.466519 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 28 09:03:48 crc kubenswrapper[4687]: I0228 09:03:48.466675 4687 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:48 crc kubenswrapper[4687]: I0228 09:03:48.467661 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:48 crc kubenswrapper[4687]: I0228 09:03:48.467708 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:48 crc kubenswrapper[4687]: I0228 09:03:48.467718 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:48 crc kubenswrapper[4687]: I0228 09:03:48.476587 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 28 09:03:48 crc kubenswrapper[4687]: I0228 09:03:48.612525 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:03:48Z is after 2026-02-23T05:33:13Z Feb 28 09:03:48 crc kubenswrapper[4687]: E0228 09:03:48.704069 4687 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 28 09:03:48 crc kubenswrapper[4687]: I0228 09:03:48.747673 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:48 crc kubenswrapper[4687]: I0228 09:03:48.748519 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:48 crc kubenswrapper[4687]: I0228 09:03:48.748565 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:48 crc kubenswrapper[4687]: I0228 09:03:48.748576 4687 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:49 crc kubenswrapper[4687]: I0228 09:03:49.614445 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.591624 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18985da322729fea default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.609279978 +0000 UTC m=+0.299849315,LastTimestamp:2026-02-28 09:03:28.609279978 +0000 UTC m=+0.299849315,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.594840 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18985da324cd2dbc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.648768956 +0000 UTC m=+0.339338292,LastTimestamp:2026-02-28 09:03:28.648768956 +0000 UTC 
m=+0.339338292,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.598240 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18985da324cd6dc3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.648785347 +0000 UTC m=+0.339354684,LastTimestamp:2026-02-28 09:03:28.648785347 +0000 UTC m=+0.339354684,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.601417 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18985da324cd9110 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.648794384 +0000 UTC m=+0.339363721,LastTimestamp:2026-02-28 09:03:28.648794384 +0000 UTC m=+0.339363721,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 
09:03:50.604589 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18985da327d4e6e9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.699606761 +0000 UTC m=+0.390176098,LastTimestamp:2026-02-28 09:03:28.699606761 +0000 UTC m=+0.390176098,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.607952 4687 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18985da324cd2dbc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18985da324cd2dbc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.648768956 +0000 UTC m=+0.339338292,LastTimestamp:2026-02-28 09:03:28.758108818 +0000 UTC m=+0.448678154,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: I0228 09:03:50.611329 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get 
resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.611386 4687 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18985da324cd6dc3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18985da324cd6dc3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.648785347 +0000 UTC m=+0.339354684,LastTimestamp:2026-02-28 09:03:28.758127623 +0000 UTC m=+0.448696960,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.614454 4687 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18985da324cd9110\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18985da324cd9110 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.648794384 +0000 UTC m=+0.339363721,LastTimestamp:2026-02-28 09:03:28.758140417 +0000 UTC m=+0.448709754,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.617545 4687 event.go:359] "Server rejected 
event (will not retry!)" err="events \"crc.18985da324cd2dbc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18985da324cd2dbc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.648768956 +0000 UTC m=+0.339338292,LastTimestamp:2026-02-28 09:03:28.759443537 +0000 UTC m=+0.450012875,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.620709 4687 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18985da324cd6dc3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18985da324cd6dc3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.648785347 +0000 UTC m=+0.339354684,LastTimestamp:2026-02-28 09:03:28.759473493 +0000 UTC m=+0.450042830,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.624218 4687 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18985da324cd9110\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the 
namespace \"default\"" event="&Event{ObjectMeta:{crc.18985da324cd9110 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.648794384 +0000 UTC m=+0.339363721,LastTimestamp:2026-02-28 09:03:28.759483231 +0000 UTC m=+0.450052569,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.627369 4687 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18985da324cd2dbc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18985da324cd2dbc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.648768956 +0000 UTC m=+0.339338292,LastTimestamp:2026-02-28 09:03:28.760710218 +0000 UTC m=+0.451279555,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.630489 4687 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18985da324cd6dc3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18985da324cd6dc3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.648785347 +0000 UTC m=+0.339354684,LastTimestamp:2026-02-28 09:03:28.760836556 +0000 UTC m=+0.451405894,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.633491 4687 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18985da324cd9110\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18985da324cd9110 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.648794384 +0000 UTC m=+0.339363721,LastTimestamp:2026-02-28 09:03:28.760931554 +0000 UTC m=+0.451500891,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.636795 4687 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18985da324cd2dbc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18985da324cd2dbc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.648768956 +0000 UTC m=+0.339338292,LastTimestamp:2026-02-28 09:03:28.761016354 +0000 UTC m=+0.451585701,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.640290 4687 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18985da324cd6dc3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18985da324cd6dc3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.648785347 +0000 UTC m=+0.339354684,LastTimestamp:2026-02-28 09:03:28.761074062 +0000 UTC m=+0.451643400,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.641229 4687 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18985da324cd9110\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18985da324cd9110 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.648794384 +0000 UTC 
m=+0.339363721,LastTimestamp:2026-02-28 09:03:28.76109941 +0000 UTC m=+0.451668748,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.643695 4687 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18985da324cd2dbc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18985da324cd2dbc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.648768956 +0000 UTC m=+0.339338292,LastTimestamp:2026-02-28 09:03:28.762989815 +0000 UTC m=+0.453559152,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.644701 4687 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18985da324cd2dbc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18985da324cd2dbc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.648768956 +0000 UTC m=+0.339338292,LastTimestamp:2026-02-28 09:03:28.76299765 +0000 UTC m=+0.453566986,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.647887 4687 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18985da324cd6dc3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18985da324cd6dc3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.648785347 +0000 UTC m=+0.339354684,LastTimestamp:2026-02-28 09:03:28.763011476 +0000 UTC m=+0.453580812,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.651071 4687 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18985da324cd9110\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18985da324cd9110 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.648794384 +0000 UTC m=+0.339363721,LastTimestamp:2026-02-28 09:03:28.763038777 +0000 UTC m=+0.453608114,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.654205 4687 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.18985da324cd6dc3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18985da324cd6dc3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.648785347 +0000 UTC m=+0.339354684,LastTimestamp:2026-02-28 09:03:28.763062732 +0000 UTC m=+0.453632068,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.657352 4687 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18985da324cd9110\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18985da324cd9110 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.648794384 +0000 UTC m=+0.339363721,LastTimestamp:2026-02-28 09:03:28.763080134 +0000 UTC m=+0.453649472,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.660370 4687 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18985da324cd2dbc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" 
in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18985da324cd2dbc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.648768956 +0000 UTC m=+0.339338292,LastTimestamp:2026-02-28 09:03:28.763347447 +0000 UTC m=+0.453916785,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.663456 4687 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18985da324cd6dc3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18985da324cd6dc3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:28.648785347 +0000 UTC m=+0.339354684,LastTimestamp:2026-02-28 09:03:28.763445241 +0000 UTC m=+0.454014578,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.667193 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18985da34303b51a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.155659034 +0000 UTC m=+0.846228371,LastTimestamp:2026-02-28 09:03:29.155659034 +0000 UTC m=+0.846228371,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.670332 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18985da3430423bd openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.155687357 +0000 UTC m=+0.846256684,LastTimestamp:2026-02-28 09:03:29.155687357 +0000 UTC m=+0.846256684,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.673280 4687 event.go:359] "Server rejected event (will 
not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18985da34305388c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.15575822 +0000 UTC m=+0.846327557,LastTimestamp:2026-02-28 09:03:29.15575822 +0000 UTC m=+0.846327557,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.676164 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18985da34372a97b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.162930555 +0000 UTC 
m=+0.853499893,LastTimestamp:2026-02-28 09:03:29.162930555 +0000 UTC m=+0.853499893,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.679109 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18985da34379c7ee openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.163397102 +0000 UTC m=+0.853966440,LastTimestamp:2026-02-28 09:03:29.163397102 +0000 UTC m=+0.853966440,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.682939 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18985da35a31df74 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.5445605 +0000 UTC m=+1.235129837,LastTimestamp:2026-02-28 09:03:29.5445605 +0000 UTC m=+1.235129837,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.686114 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18985da35a3bcfe2 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.545211874 +0000 UTC m=+1.235781212,LastTimestamp:2026-02-28 09:03:29.545211874 +0000 UTC m=+1.235781212,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.689039 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.18985da35a3c24d0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.545233616 +0000 UTC m=+1.235802952,LastTimestamp:2026-02-28 09:03:29.545233616 +0000 UTC m=+1.235802952,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.691902 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18985da35a5000bf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.546535103 +0000 UTC m=+1.237104439,LastTimestamp:2026-02-28 09:03:29.546535103 +0000 UTC m=+1.237104439,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.694916 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18985da35a521d87 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.546673543 +0000 UTC m=+1.237242871,LastTimestamp:2026-02-28 09:03:29.546673543 +0000 UTC m=+1.237242871,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.698505 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18985da35abef06e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.553805422 +0000 UTC m=+1.244374760,LastTimestamp:2026-02-28 09:03:29.553805422 +0000 UTC m=+1.244374760,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.701777 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18985da35ac2cde9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.554058729 +0000 UTC m=+1.244628066,LastTimestamp:2026-02-28 09:03:29.554058729 +0000 UTC m=+1.244628066,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.705072 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18985da35ad27a2e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.55508587 +0000 UTC m=+1.245655207,LastTimestamp:2026-02-28 09:03:29.55508587 +0000 UTC m=+1.245655207,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.708631 4687 event.go:359] "Server rejected event (will 
not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18985da35ada0d19 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.555582233 +0000 UTC m=+1.246151570,LastTimestamp:2026-02-28 09:03:29.555582233 +0000 UTC m=+1.246151570,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.711959 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18985da35ae3c48d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.556219021 +0000 UTC m=+1.246788358,LastTimestamp:2026-02-28 09:03:29.556219021 +0000 UTC 
m=+1.246788358,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.715096 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18985da35b1b9e91 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.559879313 +0000 UTC m=+1.250448639,LastTimestamp:2026-02-28 09:03:29.559879313 +0000 UTC m=+1.250448639,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.718798 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18985da3615c2296 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.66477071 +0000 UTC 
m=+1.355340047,LastTimestamp:2026-02-28 09:03:29.66477071 +0000 UTC m=+1.355340047,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.722088 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18985da36171c10e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.666187534 +0000 UTC m=+1.356756871,LastTimestamp:2026-02-28 09:03:29.666187534 +0000 UTC m=+1.356756871,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.725306 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18985da361add987 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.670125959 +0000 UTC m=+1.360695296,LastTimestamp:2026-02-28 09:03:29.670125959 +0000 UTC m=+1.360695296,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.728728 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18985da361e4caa5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.673726629 +0000 UTC m=+1.364295967,LastTimestamp:2026-02-28 09:03:29.673726629 +0000 UTC m=+1.364295967,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.732195 4687 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18985da36898785a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.786165338 +0000 UTC m=+1.476734674,LastTimestamp:2026-02-28 09:03:29.786165338 +0000 UTC m=+1.476734674,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.735458 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18985da369422e9c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.79728758 +0000 UTC m=+1.487856917,LastTimestamp:2026-02-28 09:03:29.79728758 +0000 UTC m=+1.487856917,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.738678 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18985da3695163de openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.798284254 +0000 UTC m=+1.488853591,LastTimestamp:2026-02-28 09:03:29.798284254 +0000 UTC m=+1.488853591,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.741995 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18985da36b64db17 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.833114391 +0000 UTC m=+1.523683728,LastTimestamp:2026-02-28 09:03:29.833114391 +0000 UTC m=+1.523683728,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.745222 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18985da36b7208a0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.833978016 +0000 UTC m=+1.524547353,LastTimestamp:2026-02-28 09:03:29.833978016 +0000 UTC m=+1.524547353,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.748545 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18985da36b736c7c 
openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.834069116 +0000 UTC m=+1.524638444,LastTimestamp:2026-02-28 09:03:29.834069116 +0000 UTC m=+1.524638444,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.752712 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18985da36b7aedae openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.834560942 +0000 UTC m=+1.525130279,LastTimestamp:2026-02-28 09:03:29.834560942 +0000 UTC m=+1.525130279,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.755855 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18985da36be853f7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.841730551 +0000 UTC m=+1.532299888,LastTimestamp:2026-02-28 09:03:29.841730551 +0000 UTC m=+1.532299888,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.758884 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18985da36bfcd34b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.843073867 +0000 UTC m=+1.533643205,LastTimestamp:2026-02-28 09:03:29.843073867 +0000 UTC m=+1.533643205,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.762050 
4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18985da36c0a2c62 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.843948642 +0000 UTC m=+1.534517980,LastTimestamp:2026-02-28 09:03:29.843948642 +0000 UTC m=+1.534517980,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.765054 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18985da36c45d548 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.847858504 +0000 UTC m=+1.538427841,LastTimestamp:2026-02-28 09:03:29.847858504 +0000 UTC m=+1.538427841,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.768420 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18985da36c5b31d8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.849258456 +0000 UTC m=+1.539827793,LastTimestamp:2026-02-28 09:03:29.849258456 +0000 UTC m=+1.539827793,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.771831 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18985da36c6035d1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 
09:03:29.849587153 +0000 UTC m=+1.540156491,LastTimestamp:2026-02-28 09:03:29.849587153 +0000 UTC m=+1.540156491,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.775121 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18985da3711a478a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.92889025 +0000 UTC m=+1.619459587,LastTimestamp:2026-02-28 09:03:29.92889025 +0000 UTC m=+1.619459587,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.778140 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18985da37198f1e6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.937191398 +0000 UTC m=+1.627760735,LastTimestamp:2026-02-28 09:03:29.937191398 +0000 UTC m=+1.627760735,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.781293 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18985da371ab4632 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.938392626 +0000 UTC m=+1.628961964,LastTimestamp:2026-02-28 09:03:29.938392626 +0000 UTC m=+1.628961964,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.784548 4687 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18985da373d99721 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.974982433 +0000 UTC m=+1.665551770,LastTimestamp:2026-02-28 09:03:29.974982433 +0000 UTC m=+1.665551770,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.787668 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18985da3743f147b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.981633659 +0000 UTC m=+1.672202995,LastTimestamp:2026-02-28 09:03:29.981633659 +0000 UTC m=+1.672202995,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.790889 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18985da374513e29 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.982823977 +0000 UTC m=+1.673393313,LastTimestamp:2026-02-28 09:03:29.982823977 +0000 UTC m=+1.673393313,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.793977 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18985da3746103dd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container 
kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.983857629 +0000 UTC m=+1.674426967,LastTimestamp:2026-02-28 09:03:29.983857629 +0000 UTC m=+1.674426967,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.800253 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18985da374dc2cae openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.991929006 +0000 UTC m=+1.682498343,LastTimestamp:2026-02-28 09:03:29.991929006 +0000 UTC m=+1.682498343,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.804009 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18985da374f4529d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:29.993511581 +0000 UTC m=+1.684080919,LastTimestamp:2026-02-28 09:03:29.993511581 +0000 UTC m=+1.684080919,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.807322 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18985da37a219b99 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:30.080365465 +0000 UTC m=+1.770934802,LastTimestamp:2026-02-28 09:03:30.080365465 +0000 UTC m=+1.770934802,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.810444 4687 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18985da37addbfc0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:30.092695488 +0000 UTC m=+1.783264825,LastTimestamp:2026-02-28 09:03:30.092695488 +0000 UTC m=+1.783264825,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.813591 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18985da3801dccf6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:30.180779254 +0000 UTC m=+1.871348591,LastTimestamp:2026-02-28 09:03:30.180779254 +0000 UTC m=+1.871348591,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.816569 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18985da380837208 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:30.187440648 +0000 UTC m=+1.878009986,LastTimestamp:2026-02-28 09:03:30.187440648 +0000 UTC m=+1.878009986,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.819658 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18985da380c4e662 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:30.191730274 
+0000 UTC m=+1.882299611,LastTimestamp:2026-02-28 09:03:30.191730274 +0000 UTC m=+1.882299611,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.822969 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18985da380dc157c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:30.19324966 +0000 UTC m=+1.883818997,LastTimestamp:2026-02-28 09:03:30.19324966 +0000 UTC m=+1.883818997,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.826323 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18985da381184567 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:30.197194087 +0000 UTC m=+1.887763425,LastTimestamp:2026-02-28 09:03:30.197194087 +0000 UTC m=+1.887763425,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.829689 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18985da3898cd1f7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:30.339049975 +0000 UTC m=+2.029619312,LastTimestamp:2026-02-28 09:03:30.339049975 +0000 UTC m=+2.029619312,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.832994 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.18985da38a104adc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:30.34766614 +0000 UTC m=+2.038235477,LastTimestamp:2026-02-28 09:03:30.34766614 +0000 UTC m=+2.038235477,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.836272 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18985da38a222814 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:30.348836884 +0000 UTC m=+2.039406220,LastTimestamp:2026-02-28 09:03:30.348836884 +0000 UTC m=+2.039406220,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 
09:03:50.839451 4687 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.839663 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18985da3920f1d0a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:30.481806602 +0000 UTC m=+2.172375939,LastTimestamp:2026-02-28 09:03:30.481806602 +0000 UTC m=+2.172375939,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: I0228 09:03:50.840192 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:50 crc kubenswrapper[4687]: I0228 09:03:50.841374 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:50 crc kubenswrapper[4687]: I0228 09:03:50.841405 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:50 crc kubenswrapper[4687]: I0228 09:03:50.841415 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 28 09:03:50 crc kubenswrapper[4687]: I0228 09:03:50.841449 4687 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.844010 4687 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.844318 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18985da392a185e2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:30.491401698 +0000 UTC m=+2.181971035,LastTimestamp:2026-02-28 09:03:30.491401698 +0000 UTC m=+2.181971035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.848401 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18985da39e3ba2fd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:30.686051069 +0000 UTC m=+2.376620395,LastTimestamp:2026-02-28 09:03:30.686051069 +0000 UTC m=+2.376620395,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.851684 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18985da3a718ee49 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:30.834771529 +0000 UTC m=+2.525340876,LastTimestamp:2026-02-28 09:03:30.834771529 +0000 UTC m=+2.525340876,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.854835 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.18985da3a7b64d36 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:30.845084982 +0000 UTC m=+2.535654319,LastTimestamp:2026-02-28 09:03:30.845084982 +0000 UTC m=+2.535654319,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.858508 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18985da3da43bb9d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:31.693214621 +0000 UTC m=+3.383783959,LastTimestamp:2026-02-28 09:03:31.693214621 +0000 UTC m=+3.383783959,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.861680 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18985da3e17bd55a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:31.814331738 +0000 UTC m=+3.504901075,LastTimestamp:2026-02-28 09:03:31.814331738 +0000 UTC m=+3.504901075,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.864495 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18985da3e1d52a15 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:31.820186133 +0000 UTC m=+3.510755471,LastTimestamp:2026-02-28 09:03:31.820186133 +0000 UTC m=+3.510755471,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.867567 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.18985da3e1ebecff openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:31.821677823 +0000 UTC m=+3.512247160,LastTimestamp:2026-02-28 09:03:31.821677823 +0000 UTC m=+3.512247160,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.870591 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18985da3e9674b21 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:31.947203361 +0000 UTC m=+3.637772698,LastTimestamp:2026-02-28 09:03:31.947203361 +0000 UTC m=+3.637772698,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.873717 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18985da3e9fa0b7c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:31.95682086 +0000 UTC m=+3.647390197,LastTimestamp:2026-02-28 09:03:31.95682086 +0000 UTC m=+3.647390197,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.876888 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18985da3ea0ffe26 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:31.958259238 +0000 UTC m=+3.648828576,LastTimestamp:2026-02-28 09:03:31.958259238 +0000 UTC m=+3.648828576,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.879962 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18985da3f1e96250 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:32.089946704 +0000 UTC m=+3.780516041,LastTimestamp:2026-02-28 09:03:32.089946704 +0000 UTC m=+3.780516041,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.883010 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18985da3f2689b9b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:32.098284443 +0000 UTC m=+3.788853790,LastTimestamp:2026-02-28 09:03:32.098284443 +0000 UTC m=+3.788853790,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.886206 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.18985da3f27916f3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:32.099364595 +0000 UTC m=+3.789933931,LastTimestamp:2026-02-28 09:03:32.099364595 +0000 UTC m=+3.789933931,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.889324 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18985da3fa62dfd5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:32.232126421 +0000 UTC m=+3.922695758,LastTimestamp:2026-02-28 09:03:32.232126421 +0000 UTC m=+3.922695758,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.892327 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in 
API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18985da3fadac4e2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:32.239983842 +0000 UTC m=+3.930553179,LastTimestamp:2026-02-28 09:03:32.239983842 +0000 UTC m=+3.930553179,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.895637 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18985da3faeb0861 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:32.241049697 +0000 UTC m=+3.931619034,LastTimestamp:2026-02-28 09:03:32.241049697 +0000 UTC m=+3.931619034,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.898884 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18985da402ac78cf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:32.371167439 +0000 UTC m=+4.061736776,LastTimestamp:2026-02-28 09:03:32.371167439 +0000 UTC m=+4.061736776,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.901994 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18985da40310b199 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:32.377735577 +0000 UTC m=+4.068304914,LastTimestamp:2026-02-28 09:03:32.377735577 +0000 UTC m=+4.068304914,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.906906 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event=< Feb 28 09:03:50 crc kubenswrapper[4687]: &Event{ObjectMeta:{kube-apiserver-crc.18985da612e88b21 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 28 09:03:50 crc kubenswrapper[4687]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 28 09:03:50 crc kubenswrapper[4687]: Feb 28 09:03:50 crc kubenswrapper[4687]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:41.233474337 +0000 UTC m=+12.924043674,LastTimestamp:2026-02-28 09:03:41.233474337 +0000 UTC m=+12.924043674,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 28 09:03:50 crc kubenswrapper[4687]: > Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.909916 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18985da612e9524b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 
09:03:41.233525323 +0000 UTC m=+12.924094659,LastTimestamp:2026-02-28 09:03:41.233525323 +0000 UTC m=+12.924094659,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.913149 4687 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.18985da612e88b21\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 28 09:03:50 crc kubenswrapper[4687]: &Event{ObjectMeta:{kube-apiserver-crc.18985da612e88b21 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 28 09:03:50 crc kubenswrapper[4687]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 28 09:03:50 crc kubenswrapper[4687]: Feb 28 09:03:50 crc kubenswrapper[4687]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:41.233474337 +0000 UTC m=+12.924043674,LastTimestamp:2026-02-28 09:03:41.242614194 +0000 UTC m=+12.933183531,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 28 09:03:50 crc kubenswrapper[4687]: > Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.916135 4687 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.18985da612e9524b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18985da612e9524b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:41.233525323 +0000 UTC m=+12.924094659,LastTimestamp:2026-02-28 09:03:41.242672854 +0000 UTC m=+12.933242191,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.919409 4687 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.18985da38a222814\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18985da38a222814 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:30.348836884 +0000 UTC m=+2.039406220,LastTimestamp:2026-02-28 09:03:41.726224443 +0000 UTC m=+13.416793780,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.922561 4687 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.18985da3920f1d0a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18985da3920f1d0a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:30.481806602 +0000 UTC m=+2.172375939,LastTimestamp:2026-02-28 09:03:41.892962957 +0000 UTC m=+13.583532304,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.925557 4687 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.18985da392a185e2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18985da392a185e2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container 
kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:30.491401698 +0000 UTC m=+2.181971035,LastTimestamp:2026-02-28 09:03:41.901722978 +0000 UTC m=+13.592292315,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.929575 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 28 09:03:50 crc kubenswrapper[4687]: &Event{ObjectMeta:{kube-controller-manager-crc.18985da668096fd6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 28 09:03:50 crc kubenswrapper[4687]: body: Feb 28 09:03:50 crc kubenswrapper[4687]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:42.661693398 +0000 UTC m=+14.352262745,LastTimestamp:2026-02-28 09:03:42.661693398 +0000 UTC m=+14.352262745,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 28 09:03:50 crc kubenswrapper[4687]: > Feb 28 09:03:50 crc kubenswrapper[4687]: E0228 09:03:50.932983 4687 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18985da6680c1a87 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:42.661868167 +0000 UTC m=+14.352437515,LastTimestamp:2026-02-28 09:03:42.661868167 +0000 UTC m=+14.352437515,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:51 crc kubenswrapper[4687]: I0228 09:03:51.614427 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:03:52 crc kubenswrapper[4687]: I0228 09:03:52.613813 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:03:52 crc kubenswrapper[4687]: W0228 09:03:52.633166 4687 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Feb 28 09:03:52 crc kubenswrapper[4687]: E0228 09:03:52.633214 4687 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 28 09:03:52 crc kubenswrapper[4687]: I0228 09:03:52.661771 4687 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 28 09:03:52 crc kubenswrapper[4687]: I0228 09:03:52.662501 4687 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 28 09:03:52 crc kubenswrapper[4687]: E0228 09:03:52.665674 4687 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18985da668096fd6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 28 09:03:52 crc kubenswrapper[4687]: &Event{ObjectMeta:{kube-controller-manager-crc.18985da668096fd6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: 
request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 28 09:03:52 crc kubenswrapper[4687]: body: Feb 28 09:03:52 crc kubenswrapper[4687]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:42.661693398 +0000 UTC m=+14.352262745,LastTimestamp:2026-02-28 09:03:52.66246772 +0000 UTC m=+24.353037057,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 28 09:03:52 crc kubenswrapper[4687]: > Feb 28 09:03:52 crc kubenswrapper[4687]: E0228 09:03:52.669001 4687 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18985da6680c1a87\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18985da6680c1a87 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:03:42.661868167 +0000 UTC m=+14.352437515,LastTimestamp:2026-02-28 09:03:52.662534236 +0000 UTC m=+24.353103573,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:03:53 crc kubenswrapper[4687]: I0228 09:03:53.613971 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io 
"crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:03:53 crc kubenswrapper[4687]: I0228 09:03:53.624073 4687 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 28 09:03:53 crc kubenswrapper[4687]: I0228 09:03:53.636440 4687 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 28 09:03:54 crc kubenswrapper[4687]: I0228 09:03:54.614004 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:03:55 crc kubenswrapper[4687]: I0228 09:03:55.613931 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:03:56 crc kubenswrapper[4687]: I0228 09:03:56.614568 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:03:57 crc kubenswrapper[4687]: W0228 09:03:57.521859 4687 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Feb 28 09:03:57 crc kubenswrapper[4687]: E0228 09:03:57.521937 4687 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot 
list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 28 09:03:57 crc kubenswrapper[4687]: I0228 09:03:57.613711 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:03:57 crc kubenswrapper[4687]: W0228 09:03:57.821400 4687 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Feb 28 09:03:57 crc kubenswrapper[4687]: E0228 09:03:57.821458 4687 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 28 09:03:57 crc kubenswrapper[4687]: E0228 09:03:57.844110 4687 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 28 09:03:57 crc kubenswrapper[4687]: I0228 09:03:57.844245 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:57 crc kubenswrapper[4687]: I0228 09:03:57.845237 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:57 crc kubenswrapper[4687]: I0228 09:03:57.845274 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:57 crc kubenswrapper[4687]: I0228 
09:03:57.845286 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:57 crc kubenswrapper[4687]: I0228 09:03:57.845309 4687 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 09:03:57 crc kubenswrapper[4687]: E0228 09:03:57.846366 4687 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 28 09:03:58 crc kubenswrapper[4687]: I0228 09:03:58.613979 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:03:58 crc kubenswrapper[4687]: E0228 09:03:58.704746 4687 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 28 09:03:58 crc kubenswrapper[4687]: W0228 09:03:58.845471 4687 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Feb 28 09:03:58 crc kubenswrapper[4687]: E0228 09:03:58.845553 4687 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 28 09:03:59 crc kubenswrapper[4687]: I0228 09:03:59.614370 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:03:59 crc kubenswrapper[4687]: 
I0228 09:03:59.673903 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 09:03:59 crc kubenswrapper[4687]: I0228 09:03:59.674157 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:59 crc kubenswrapper[4687]: I0228 09:03:59.675740 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:59 crc kubenswrapper[4687]: I0228 09:03:59.675974 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:59 crc kubenswrapper[4687]: I0228 09:03:59.675986 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:03:59 crc kubenswrapper[4687]: I0228 09:03:59.680509 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 09:03:59 crc kubenswrapper[4687]: I0228 09:03:59.774228 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:03:59 crc kubenswrapper[4687]: I0228 09:03:59.774905 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:03:59 crc kubenswrapper[4687]: I0228 09:03:59.774939 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:03:59 crc kubenswrapper[4687]: I0228 09:03:59.774951 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:00 crc kubenswrapper[4687]: I0228 09:04:00.613768 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Feb 28 09:04:00 crc kubenswrapper[4687]: I0228 09:04:00.656472 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:04:00 crc kubenswrapper[4687]: I0228 09:04:00.657646 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:00 crc kubenswrapper[4687]: I0228 09:04:00.657696 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:00 crc kubenswrapper[4687]: I0228 09:04:00.657709 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:00 crc kubenswrapper[4687]: I0228 09:04:00.658576 4687 scope.go:117] "RemoveContainer" containerID="4ef7a656d507e18fb707996716f5d714d93c502e80b0677a494814ba269f79a1" Feb 28 09:04:01 crc kubenswrapper[4687]: I0228 09:04:01.613953 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:01 crc kubenswrapper[4687]: I0228 09:04:01.779290 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 28 09:04:01 crc kubenswrapper[4687]: I0228 09:04:01.779672 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 28 09:04:01 crc kubenswrapper[4687]: I0228 09:04:01.780880 4687 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6d589be4b2d1f4a394386df31be99155f48d5d76cedf451be1568f1759e64ab7" exitCode=255 Feb 28 09:04:01 crc kubenswrapper[4687]: I0228 
09:04:01.780919 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6d589be4b2d1f4a394386df31be99155f48d5d76cedf451be1568f1759e64ab7"} Feb 28 09:04:01 crc kubenswrapper[4687]: I0228 09:04:01.780965 4687 scope.go:117] "RemoveContainer" containerID="4ef7a656d507e18fb707996716f5d714d93c502e80b0677a494814ba269f79a1" Feb 28 09:04:01 crc kubenswrapper[4687]: I0228 09:04:01.781142 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:04:01 crc kubenswrapper[4687]: I0228 09:04:01.781855 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:01 crc kubenswrapper[4687]: I0228 09:04:01.781886 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:01 crc kubenswrapper[4687]: I0228 09:04:01.781895 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:01 crc kubenswrapper[4687]: I0228 09:04:01.782423 4687 scope.go:117] "RemoveContainer" containerID="6d589be4b2d1f4a394386df31be99155f48d5d76cedf451be1568f1759e64ab7" Feb 28 09:04:01 crc kubenswrapper[4687]: E0228 09:04:01.782606 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 09:04:02 crc kubenswrapper[4687]: I0228 09:04:02.614162 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot 
get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:02 crc kubenswrapper[4687]: I0228 09:04:02.784972 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 28 09:04:03 crc kubenswrapper[4687]: I0228 09:04:03.614046 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:04 crc kubenswrapper[4687]: I0228 09:04:04.613338 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:04 crc kubenswrapper[4687]: I0228 09:04:04.847277 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:04:04 crc kubenswrapper[4687]: I0228 09:04:04.848191 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:04 crc kubenswrapper[4687]: I0228 09:04:04.848221 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:04 crc kubenswrapper[4687]: I0228 09:04:04.848229 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:04 crc kubenswrapper[4687]: E0228 09:04:04.848230 4687 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 28 09:04:04 crc kubenswrapper[4687]: I0228 09:04:04.848252 
4687 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 09:04:04 crc kubenswrapper[4687]: E0228 09:04:04.851651 4687 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 28 09:04:05 crc kubenswrapper[4687]: I0228 09:04:05.383671 4687 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:04:05 crc kubenswrapper[4687]: I0228 09:04:05.383838 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:04:05 crc kubenswrapper[4687]: I0228 09:04:05.384856 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:05 crc kubenswrapper[4687]: I0228 09:04:05.384902 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:05 crc kubenswrapper[4687]: I0228 09:04:05.384913 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:05 crc kubenswrapper[4687]: I0228 09:04:05.385502 4687 scope.go:117] "RemoveContainer" containerID="6d589be4b2d1f4a394386df31be99155f48d5d76cedf451be1568f1759e64ab7" Feb 28 09:04:05 crc kubenswrapper[4687]: E0228 09:04:05.385705 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 09:04:05 crc kubenswrapper[4687]: I0228 09:04:05.614008 4687 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:05 crc kubenswrapper[4687]: I0228 09:04:05.717659 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:04:05 crc kubenswrapper[4687]: I0228 09:04:05.793597 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:04:05 crc kubenswrapper[4687]: I0228 09:04:05.794310 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:05 crc kubenswrapper[4687]: I0228 09:04:05.794345 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:05 crc kubenswrapper[4687]: I0228 09:04:05.794354 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:05 crc kubenswrapper[4687]: I0228 09:04:05.794801 4687 scope.go:117] "RemoveContainer" containerID="6d589be4b2d1f4a394386df31be99155f48d5d76cedf451be1568f1759e64ab7" Feb 28 09:04:05 crc kubenswrapper[4687]: E0228 09:04:05.794962 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 09:04:06 crc kubenswrapper[4687]: I0228 09:04:06.612955 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:07 
crc kubenswrapper[4687]: I0228 09:04:07.613955 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:08 crc kubenswrapper[4687]: I0228 09:04:08.612991 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:08 crc kubenswrapper[4687]: E0228 09:04:08.705165 4687 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 28 09:04:08 crc kubenswrapper[4687]: W0228 09:04:08.922429 4687 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Feb 28 09:04:08 crc kubenswrapper[4687]: E0228 09:04:08.922477 4687 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 28 09:04:09 crc kubenswrapper[4687]: I0228 09:04:09.613131 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:10 crc kubenswrapper[4687]: I0228 09:04:10.613313 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" 
at the cluster scope Feb 28 09:04:11 crc kubenswrapper[4687]: I0228 09:04:11.613294 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:11 crc kubenswrapper[4687]: I0228 09:04:11.852182 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:04:11 crc kubenswrapper[4687]: E0228 09:04:11.852400 4687 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 28 09:04:11 crc kubenswrapper[4687]: I0228 09:04:11.853978 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:11 crc kubenswrapper[4687]: I0228 09:04:11.854031 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:11 crc kubenswrapper[4687]: I0228 09:04:11.854042 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:11 crc kubenswrapper[4687]: I0228 09:04:11.854070 4687 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 09:04:11 crc kubenswrapper[4687]: E0228 09:04:11.857375 4687 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 28 09:04:12 crc kubenswrapper[4687]: I0228 09:04:12.617224 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in 
API group "storage.k8s.io" at the cluster scope Feb 28 09:04:13 crc kubenswrapper[4687]: W0228 09:04:13.214844 4687 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Feb 28 09:04:13 crc kubenswrapper[4687]: E0228 09:04:13.214906 4687 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 28 09:04:13 crc kubenswrapper[4687]: W0228 09:04:13.406711 4687 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Feb 28 09:04:13 crc kubenswrapper[4687]: E0228 09:04:13.406755 4687 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 28 09:04:13 crc kubenswrapper[4687]: I0228 09:04:13.613690 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:14 crc kubenswrapper[4687]: I0228 09:04:14.613943 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 
09:04:15 crc kubenswrapper[4687]: I0228 09:04:15.613163 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:16 crc kubenswrapper[4687]: I0228 09:04:16.613596 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:16 crc kubenswrapper[4687]: I0228 09:04:16.619619 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 09:04:16 crc kubenswrapper[4687]: I0228 09:04:16.619805 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:04:16 crc kubenswrapper[4687]: I0228 09:04:16.621120 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:16 crc kubenswrapper[4687]: I0228 09:04:16.621167 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:16 crc kubenswrapper[4687]: I0228 09:04:16.621176 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:17 crc kubenswrapper[4687]: I0228 09:04:17.613611 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:18 crc kubenswrapper[4687]: I0228 09:04:18.614131 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:18 crc kubenswrapper[4687]: I0228 09:04:18.655894 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:04:18 crc kubenswrapper[4687]: I0228 09:04:18.657109 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:18 crc kubenswrapper[4687]: I0228 09:04:18.657149 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:18 crc kubenswrapper[4687]: I0228 09:04:18.657159 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:18 crc kubenswrapper[4687]: I0228 09:04:18.657596 4687 scope.go:117] "RemoveContainer" containerID="6d589be4b2d1f4a394386df31be99155f48d5d76cedf451be1568f1759e64ab7" Feb 28 09:04:18 crc kubenswrapper[4687]: E0228 09:04:18.657757 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 09:04:18 crc kubenswrapper[4687]: E0228 09:04:18.705639 4687 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 28 09:04:18 crc kubenswrapper[4687]: E0228 09:04:18.857474 4687 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 28 09:04:18 crc kubenswrapper[4687]: I0228 
09:04:18.857488 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:04:18 crc kubenswrapper[4687]: I0228 09:04:18.858388 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:18 crc kubenswrapper[4687]: I0228 09:04:18.858420 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:18 crc kubenswrapper[4687]: I0228 09:04:18.858430 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:18 crc kubenswrapper[4687]: I0228 09:04:18.858448 4687 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 09:04:18 crc kubenswrapper[4687]: E0228 09:04:18.862212 4687 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 28 09:04:19 crc kubenswrapper[4687]: I0228 09:04:19.613932 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:20 crc kubenswrapper[4687]: I0228 09:04:20.614295 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:21 crc kubenswrapper[4687]: I0228 09:04:21.614544 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:21 crc kubenswrapper[4687]: W0228 
09:04:21.675797 4687 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:21 crc kubenswrapper[4687]: E0228 09:04:21.675850 4687 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 28 09:04:22 crc kubenswrapper[4687]: I0228 09:04:22.614344 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:23 crc kubenswrapper[4687]: I0228 09:04:23.613759 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:24 crc kubenswrapper[4687]: I0228 09:04:24.613233 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:25 crc kubenswrapper[4687]: I0228 09:04:25.615740 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:25 crc kubenswrapper[4687]: E0228 09:04:25.860660 4687 controller.go:145] "Failed to ensure lease exists, will retry" 
err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 28 09:04:25 crc kubenswrapper[4687]: I0228 09:04:25.862760 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:04:25 crc kubenswrapper[4687]: I0228 09:04:25.863580 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:25 crc kubenswrapper[4687]: I0228 09:04:25.863628 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:25 crc kubenswrapper[4687]: I0228 09:04:25.863638 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:25 crc kubenswrapper[4687]: I0228 09:04:25.863660 4687 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 09:04:25 crc kubenswrapper[4687]: E0228 09:04:25.866432 4687 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 28 09:04:26 crc kubenswrapper[4687]: I0228 09:04:26.613426 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:27 crc kubenswrapper[4687]: I0228 09:04:27.613371 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:28 crc kubenswrapper[4687]: I0228 09:04:28.615001 4687 csi_plugin.go:884] Failed to contact 
API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:28 crc kubenswrapper[4687]: E0228 09:04:28.706740 4687 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 28 09:04:29 crc kubenswrapper[4687]: I0228 09:04:29.614131 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:30 crc kubenswrapper[4687]: I0228 09:04:30.613327 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:31 crc kubenswrapper[4687]: I0228 09:04:31.613889 4687 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 09:04:31 crc kubenswrapper[4687]: I0228 09:04:31.914480 4687 csr.go:261] certificate signing request csr-kj26x is approved, waiting to be issued Feb 28 09:04:31 crc kubenswrapper[4687]: I0228 09:04:31.921845 4687 csr.go:257] certificate signing request csr-kj26x is issued Feb 28 09:04:31 crc kubenswrapper[4687]: I0228 09:04:31.989739 4687 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.533105 4687 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.656671 4687 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.658423 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.658458 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.658474 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.659011 4687 scope.go:117] "RemoveContainer" containerID="6d589be4b2d1f4a394386df31be99155f48d5d76cedf451be1568f1759e64ab7" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.854866 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.856830 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"110dc193591d77cad10858a579d47ef5c71456399bf60b68f6b36dc40fc19406"} Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.856977 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.857755 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.857788 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.857799 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.867053 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.867949 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.867978 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.867988 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.868094 4687 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.874959 4687 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.875488 4687 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 28 09:04:32 crc kubenswrapper[4687]: E0228 09:04:32.875537 4687 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.878073 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.878107 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.878120 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.878133 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.878145 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:32Z","lastTransitionTime":"2026-02-28T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 09:04:32 crc kubenswrapper[4687]: E0228 09:04:32.888671 4687 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"76119540-8bc2-4cd3-a111-0e11e6360590\\\",\\\"systemUUID\\\":\\\"5b9fb325-94af-4056-b5ce-29e2eb30cdd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.893885 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.893912 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.893925 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.893938 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.893947 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:32Z","lastTransitionTime":"2026-02-28T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:32 crc kubenswrapper[4687]: E0228 09:04:32.901233 4687 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"76119540-8bc2-4cd3-a111-0e11e6360590\\\",\\\"systemUUID\\\":\\\"5b9fb325-94af-4056-b5ce-29e2eb30cdd4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.906345 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.906388 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.906401 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.906418 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.906429 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:32Z","lastTransitionTime":"2026-02-28T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:32 crc kubenswrapper[4687]: E0228 09:04:32.914338 4687 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"76119540-8bc2-4cd3-a111-0e11e6360590\\\",\\\"systemUUID\\\":\\\"5b9fb325-94af-4056-b5ce-29e2eb30cdd4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.920054 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.920103 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.920114 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.920126 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.920136 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:32Z","lastTransitionTime":"2026-02-28T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.923856 4687 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-02 01:16:49.318989857 +0000 UTC Feb 28 09:04:32 crc kubenswrapper[4687]: I0228 09:04:32.923886 4687 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6640h12m16.395106532s for next certificate rotation Feb 28 09:04:32 crc kubenswrapper[4687]: E0228 09:04:32.926848 4687 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"76119540-8bc2-4cd3-a111-0e11e6360590\\\",\\\"systemUUID\\\":\\\"5b9fb325-94af-4056-b5ce-29e2eb30cdd4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 09:04:32 crc kubenswrapper[4687]: E0228 09:04:32.926987 4687 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 28 09:04:32 crc kubenswrapper[4687]: E0228 09:04:32.927042 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:33 crc kubenswrapper[4687]: E0228 09:04:33.027626 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:33 crc kubenswrapper[4687]: E0228 09:04:33.128241 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:33 crc kubenswrapper[4687]: E0228 09:04:33.228978 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:33 crc kubenswrapper[4687]: E0228 09:04:33.330042 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:33 crc kubenswrapper[4687]: E0228 09:04:33.430737 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:33 crc kubenswrapper[4687]: E0228 09:04:33.531481 4687 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Feb 28 09:04:33 crc kubenswrapper[4687]: E0228 09:04:33.631801 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:33 crc kubenswrapper[4687]: E0228 09:04:33.732385 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:33 crc kubenswrapper[4687]: E0228 09:04:33.833036 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:33 crc kubenswrapper[4687]: I0228 09:04:33.862960 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 28 09:04:33 crc kubenswrapper[4687]: I0228 09:04:33.863641 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 28 09:04:33 crc kubenswrapper[4687]: I0228 09:04:33.865961 4687 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="110dc193591d77cad10858a579d47ef5c71456399bf60b68f6b36dc40fc19406" exitCode=255 Feb 28 09:04:33 crc kubenswrapper[4687]: I0228 09:04:33.866035 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"110dc193591d77cad10858a579d47ef5c71456399bf60b68f6b36dc40fc19406"} Feb 28 09:04:33 crc kubenswrapper[4687]: I0228 09:04:33.866098 4687 scope.go:117] "RemoveContainer" containerID="6d589be4b2d1f4a394386df31be99155f48d5d76cedf451be1568f1759e64ab7" Feb 28 09:04:33 crc kubenswrapper[4687]: I0228 09:04:33.866214 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:04:33 crc kubenswrapper[4687]: I0228 09:04:33.867132 
4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:33 crc kubenswrapper[4687]: I0228 09:04:33.867173 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:33 crc kubenswrapper[4687]: I0228 09:04:33.867186 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:33 crc kubenswrapper[4687]: I0228 09:04:33.867966 4687 scope.go:117] "RemoveContainer" containerID="110dc193591d77cad10858a579d47ef5c71456399bf60b68f6b36dc40fc19406" Feb 28 09:04:33 crc kubenswrapper[4687]: E0228 09:04:33.868175 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 09:04:33 crc kubenswrapper[4687]: E0228 09:04:33.933428 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:34 crc kubenswrapper[4687]: E0228 09:04:34.034459 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:34 crc kubenswrapper[4687]: E0228 09:04:34.135014 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:34 crc kubenswrapper[4687]: E0228 09:04:34.235856 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:34 crc kubenswrapper[4687]: E0228 09:04:34.336943 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:34 crc kubenswrapper[4687]: E0228 
09:04:34.438078 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:34 crc kubenswrapper[4687]: E0228 09:04:34.538792 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:34 crc kubenswrapper[4687]: E0228 09:04:34.639129 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:34 crc kubenswrapper[4687]: E0228 09:04:34.739770 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:34 crc kubenswrapper[4687]: E0228 09:04:34.840818 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:34 crc kubenswrapper[4687]: I0228 09:04:34.870365 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Feb 28 09:04:34 crc kubenswrapper[4687]: E0228 09:04:34.941781 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:35 crc kubenswrapper[4687]: E0228 09:04:35.042581 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:35 crc kubenswrapper[4687]: E0228 09:04:35.143521 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:35 crc kubenswrapper[4687]: E0228 09:04:35.244196 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:35 crc kubenswrapper[4687]: E0228 09:04:35.344632 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:35 crc kubenswrapper[4687]: I0228 09:04:35.383924 4687 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 28 09:04:35 crc kubenswrapper[4687]: I0228 09:04:35.384075 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 09:04:35 crc kubenswrapper[4687]: I0228 09:04:35.384932 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 09:04:35 crc kubenswrapper[4687]: I0228 09:04:35.384964 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 09:04:35 crc kubenswrapper[4687]: I0228 09:04:35.384974 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 09:04:35 crc kubenswrapper[4687]: I0228 09:04:35.385422 4687 scope.go:117] "RemoveContainer" containerID="110dc193591d77cad10858a579d47ef5c71456399bf60b68f6b36dc40fc19406"
Feb 28 09:04:35 crc kubenswrapper[4687]: E0228 09:04:35.385617 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 28 09:04:35 crc kubenswrapper[4687]: E0228 09:04:35.444713 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:35 crc kubenswrapper[4687]: E0228 09:04:35.545453 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:35 crc kubenswrapper[4687]: E0228 09:04:35.645654 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:35 crc kubenswrapper[4687]: I0228 09:04:35.717238 4687
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 28 09:04:35 crc kubenswrapper[4687]: E0228 09:04:35.746642 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:35 crc kubenswrapper[4687]: E0228 09:04:35.847639 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:35 crc kubenswrapper[4687]: I0228 09:04:35.875085 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 09:04:35 crc kubenswrapper[4687]: I0228 09:04:35.876077 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 09:04:35 crc kubenswrapper[4687]: I0228 09:04:35.876135 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 09:04:35 crc kubenswrapper[4687]: I0228 09:04:35.876148 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 09:04:35 crc kubenswrapper[4687]: I0228 09:04:35.876822 4687 scope.go:117] "RemoveContainer" containerID="110dc193591d77cad10858a579d47ef5c71456399bf60b68f6b36dc40fc19406"
Feb 28 09:04:35 crc kubenswrapper[4687]: E0228 09:04:35.877063 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 28 09:04:35 crc kubenswrapper[4687]: E0228 09:04:35.947882 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:36 crc kubenswrapper[4687]: E0228 09:04:36.048785 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:36 crc kubenswrapper[4687]: E0228 09:04:36.149677 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:36 crc kubenswrapper[4687]: E0228 09:04:36.250463 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:36 crc kubenswrapper[4687]: E0228 09:04:36.351230 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:36 crc kubenswrapper[4687]: E0228 09:04:36.451654 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:36 crc kubenswrapper[4687]: E0228 09:04:36.552485 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:36 crc kubenswrapper[4687]: E0228 09:04:36.652574 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:36 crc kubenswrapper[4687]: E0228 09:04:36.753042 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:36 crc kubenswrapper[4687]: E0228 09:04:36.853999 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:36 crc kubenswrapper[4687]: E0228 09:04:36.954796 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:37 crc kubenswrapper[4687]: E0228 09:04:37.055057 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:37 crc kubenswrapper[4687]: E0228 09:04:37.155996 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:37 crc kubenswrapper[4687]: E0228 09:04:37.256764 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:37 crc kubenswrapper[4687]: E0228 09:04:37.357597 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:37 crc kubenswrapper[4687]: E0228 09:04:37.458497 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:37 crc kubenswrapper[4687]: E0228 09:04:37.559624 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:37 crc kubenswrapper[4687]: E0228 09:04:37.660132 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:37 crc kubenswrapper[4687]: E0228 09:04:37.760601 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:37 crc kubenswrapper[4687]: E0228 09:04:37.861548 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:37 crc kubenswrapper[4687]: E0228 09:04:37.961959 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:38 crc kubenswrapper[4687]: E0228 09:04:38.062978 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:38 crc kubenswrapper[4687]: E0228 09:04:38.163238 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:38 crc kubenswrapper[4687]: E0228 09:04:38.263432 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:38 crc kubenswrapper[4687]: E0228 09:04:38.364409 4687 kubelet_node_status.go:503] "Error getting the current node from
lister" err="node \"crc\" not found"
Feb 28 09:04:38 crc kubenswrapper[4687]: E0228 09:04:38.464524 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:38 crc kubenswrapper[4687]: E0228 09:04:38.565334 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:38 crc kubenswrapper[4687]: E0228 09:04:38.666256 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:38 crc kubenswrapper[4687]: E0228 09:04:38.707579 4687 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 28 09:04:38 crc kubenswrapper[4687]: E0228 09:04:38.767057 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:38 crc kubenswrapper[4687]: E0228 09:04:38.867922 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:38 crc kubenswrapper[4687]: E0228 09:04:38.968761 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:39 crc kubenswrapper[4687]: E0228 09:04:39.069835 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:39 crc kubenswrapper[4687]: E0228 09:04:39.170162 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:39 crc kubenswrapper[4687]: E0228 09:04:39.271092 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:39 crc kubenswrapper[4687]: E0228 09:04:39.372119 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:39 crc kubenswrapper[4687]: E0228 09:04:39.473119 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:39 crc kubenswrapper[4687]: E0228 09:04:39.574001 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:39 crc kubenswrapper[4687]: E0228 09:04:39.674720 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:39 crc kubenswrapper[4687]: E0228 09:04:39.774851 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:39 crc kubenswrapper[4687]: E0228 09:04:39.875641 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:39 crc kubenswrapper[4687]: E0228 09:04:39.976342 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:40 crc kubenswrapper[4687]: E0228 09:04:40.076979 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:40 crc kubenswrapper[4687]: E0228 09:04:40.177239 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:40 crc kubenswrapper[4687]: E0228 09:04:40.278356 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:40 crc kubenswrapper[4687]: E0228 09:04:40.378676 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:40 crc kubenswrapper[4687]: E0228 09:04:40.479591 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:40 crc kubenswrapper[4687]: E0228 09:04:40.580593 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:40 crc
kubenswrapper[4687]: E0228 09:04:40.680969 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:40 crc kubenswrapper[4687]: E0228 09:04:40.781363 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:40 crc kubenswrapper[4687]: E0228 09:04:40.882208 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:40 crc kubenswrapper[4687]: E0228 09:04:40.983147 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:41 crc kubenswrapper[4687]: E0228 09:04:41.084192 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:41 crc kubenswrapper[4687]: E0228 09:04:41.185205 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:41 crc kubenswrapper[4687]: E0228 09:04:41.285463 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:41 crc kubenswrapper[4687]: E0228 09:04:41.386518 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:41 crc kubenswrapper[4687]: E0228 09:04:41.487443 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:41 crc kubenswrapper[4687]: E0228 09:04:41.588413 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:41 crc kubenswrapper[4687]: E0228 09:04:41.689257 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:41 crc kubenswrapper[4687]: E0228 09:04:41.790354 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:41 crc kubenswrapper[4687]: E0228 09:04:41.890438 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:41 crc kubenswrapper[4687]: E0228 09:04:41.991274 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:42 crc kubenswrapper[4687]: E0228 09:04:42.092178 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:42 crc kubenswrapper[4687]: E0228 09:04:42.193163 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:42 crc kubenswrapper[4687]: E0228 09:04:42.293461 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:42 crc kubenswrapper[4687]: E0228 09:04:42.394505 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:42 crc kubenswrapper[4687]: E0228 09:04:42.495240 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:42 crc kubenswrapper[4687]: E0228 09:04:42.595543 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:42 crc kubenswrapper[4687]: E0228 09:04:42.696089 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:42 crc kubenswrapper[4687]: E0228 09:04:42.796494 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:42 crc kubenswrapper[4687]: E0228 09:04:42.896824 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:42 crc kubenswrapper[4687]: E0228 09:04:42.996957 4687 kubelet_node_status.go:503]
"Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 09:04:42 crc kubenswrapper[4687]: E0228 09:04:42.997006 4687 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Feb 28 09:04:43 crc kubenswrapper[4687]: I0228 09:04:43.000486 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 09:04:43 crc kubenswrapper[4687]: I0228 09:04:43.000523 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 09:04:43 crc kubenswrapper[4687]: I0228 09:04:43.000533 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 09:04:43 crc kubenswrapper[4687]: I0228 09:04:43.000548 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 09:04:43 crc kubenswrapper[4687]: I0228 09:04:43.000559 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:43Z","lastTransitionTime":"2026-02-28T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Feb 28 09:04:43 crc kubenswrapper[4687]: E0228 09:04:43.011304 4687 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"76119540-8bc2-4cd3-a111-0e11e6360590\\\",\\\"systemUUID\\\":\\\"5b9fb325-94af-4056-b5ce-29e2eb30cdd4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 28 09:04:43 crc kubenswrapper[4687]: I0228 09:04:43.016794 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 09:04:43 crc kubenswrapper[4687]: I0228 09:04:43.016830 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 09:04:43 crc kubenswrapper[4687]: I0228 09:04:43.016842 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 09:04:43 crc kubenswrapper[4687]: I0228 09:04:43.016861 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 09:04:43 crc kubenswrapper[4687]: I0228 09:04:43.016874 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:43Z","lastTransitionTime":"2026-02-28T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Feb 28 09:04:43 crc kubenswrapper[4687]: E0228 09:04:43.024533 4687 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"76119540-8bc2-4cd3-a111-0e11e6360590\\\",\\\"systemUUID\\\":\\\"5b9fb325-94af-4056-b5ce-29e2eb30cdd4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 09:04:43 crc kubenswrapper[4687]: I0228 09:04:43.029999 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:43 crc kubenswrapper[4687]: I0228 09:04:43.030061 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:43 crc kubenswrapper[4687]: I0228 09:04:43.030075 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:43 crc kubenswrapper[4687]: I0228 09:04:43.030091 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:43 crc kubenswrapper[4687]: I0228 09:04:43.030105 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:43Z","lastTransitionTime":"2026-02-28T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:43 crc kubenswrapper[4687]: E0228 09:04:43.037730 4687 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"76119540-8bc2-4cd3-a111-0e11e6360590\\\",\\\"systemUUID\\\":\\\"5b9fb325-94af-4056-b5ce-29e2eb30cdd4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 09:04:43 crc kubenswrapper[4687]: I0228 09:04:43.043823 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:43 crc kubenswrapper[4687]: I0228 09:04:43.043884 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:43 crc kubenswrapper[4687]: I0228 09:04:43.043899 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:43 crc kubenswrapper[4687]: I0228 09:04:43.043916 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:43 crc kubenswrapper[4687]: I0228 09:04:43.043929 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:43Z","lastTransitionTime":"2026-02-28T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:43 crc kubenswrapper[4687]: E0228 09:04:43.051574 4687 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"76119540-8bc2-4cd3-a111-0e11e6360590\\\",\\\"systemUUID\\\":\\\"5b9fb325-94af-4056-b5ce-29e2eb30cdd4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 09:04:43 crc kubenswrapper[4687]: E0228 09:04:43.051685 4687 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 28 09:04:43 crc kubenswrapper[4687]: E0228 09:04:43.097855 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:43 crc kubenswrapper[4687]: E0228 09:04:43.198007 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:43 crc kubenswrapper[4687]: E0228 09:04:43.298892 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:43 crc kubenswrapper[4687]: E0228 09:04:43.399736 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:43 crc kubenswrapper[4687]: E0228 09:04:43.500823 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:43 crc kubenswrapper[4687]: E0228 09:04:43.601236 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:43 crc kubenswrapper[4687]: E0228 09:04:43.701904 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:43 crc kubenswrapper[4687]: E0228 09:04:43.802585 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:43 crc kubenswrapper[4687]: E0228 09:04:43.903381 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:44 crc kubenswrapper[4687]: E0228 09:04:44.004443 4687 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:44 crc kubenswrapper[4687]: E0228 09:04:44.105641 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:44 crc kubenswrapper[4687]: E0228 09:04:44.206693 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:44 crc kubenswrapper[4687]: E0228 09:04:44.307789 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:44 crc kubenswrapper[4687]: E0228 09:04:44.408888 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:44 crc kubenswrapper[4687]: E0228 09:04:44.509616 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:44 crc kubenswrapper[4687]: E0228 09:04:44.610390 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:44 crc kubenswrapper[4687]: E0228 09:04:44.710872 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:44 crc kubenswrapper[4687]: E0228 09:04:44.811875 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:44 crc kubenswrapper[4687]: E0228 09:04:44.912460 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:45 crc kubenswrapper[4687]: E0228 09:04:45.013221 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:45 crc kubenswrapper[4687]: E0228 09:04:45.113709 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:45 crc 
kubenswrapper[4687]: E0228 09:04:45.214263 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:45 crc kubenswrapper[4687]: E0228 09:04:45.314984 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:45 crc kubenswrapper[4687]: E0228 09:04:45.415328 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:45 crc kubenswrapper[4687]: E0228 09:04:45.516407 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:45 crc kubenswrapper[4687]: E0228 09:04:45.617436 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:45 crc kubenswrapper[4687]: E0228 09:04:45.717561 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:45 crc kubenswrapper[4687]: E0228 09:04:45.818047 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:45 crc kubenswrapper[4687]: E0228 09:04:45.918858 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:46 crc kubenswrapper[4687]: E0228 09:04:46.019422 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:46 crc kubenswrapper[4687]: E0228 09:04:46.119867 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:46 crc kubenswrapper[4687]: E0228 09:04:46.220071 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:46 crc kubenswrapper[4687]: E0228 09:04:46.321133 4687 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Feb 28 09:04:46 crc kubenswrapper[4687]: E0228 09:04:46.421291 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:46 crc kubenswrapper[4687]: E0228 09:04:46.522285 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:46 crc kubenswrapper[4687]: E0228 09:04:46.622681 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:46 crc kubenswrapper[4687]: E0228 09:04:46.723506 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:46 crc kubenswrapper[4687]: E0228 09:04:46.823816 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:46 crc kubenswrapper[4687]: E0228 09:04:46.924702 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:47 crc kubenswrapper[4687]: E0228 09:04:47.025855 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:47 crc kubenswrapper[4687]: E0228 09:04:47.126497 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:47 crc kubenswrapper[4687]: E0228 09:04:47.227245 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:47 crc kubenswrapper[4687]: E0228 09:04:47.327706 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:47 crc kubenswrapper[4687]: E0228 09:04:47.428621 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:47 crc kubenswrapper[4687]: E0228 09:04:47.529706 4687 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:47 crc kubenswrapper[4687]: E0228 09:04:47.630645 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:47 crc kubenswrapper[4687]: E0228 09:04:47.730829 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:47 crc kubenswrapper[4687]: E0228 09:04:47.831792 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:47 crc kubenswrapper[4687]: E0228 09:04:47.932511 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:48 crc kubenswrapper[4687]: E0228 09:04:48.033552 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:48 crc kubenswrapper[4687]: E0228 09:04:48.134201 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:48 crc kubenswrapper[4687]: E0228 09:04:48.234360 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:48 crc kubenswrapper[4687]: E0228 09:04:48.335080 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:48 crc kubenswrapper[4687]: E0228 09:04:48.436166 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:48 crc kubenswrapper[4687]: E0228 09:04:48.536312 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:48 crc kubenswrapper[4687]: E0228 09:04:48.636724 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:48 crc kubenswrapper[4687]: I0228 
09:04:48.656431 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:04:48 crc kubenswrapper[4687]: I0228 09:04:48.656434 4687 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 09:04:48 crc kubenswrapper[4687]: I0228 09:04:48.657599 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:48 crc kubenswrapper[4687]: I0228 09:04:48.657649 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:48 crc kubenswrapper[4687]: I0228 09:04:48.657873 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:48 crc kubenswrapper[4687]: I0228 09:04:48.657655 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:48 crc kubenswrapper[4687]: I0228 09:04:48.658049 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:48 crc kubenswrapper[4687]: I0228 09:04:48.658058 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:48 crc kubenswrapper[4687]: I0228 09:04:48.658458 4687 scope.go:117] "RemoveContainer" containerID="110dc193591d77cad10858a579d47ef5c71456399bf60b68f6b36dc40fc19406" Feb 28 09:04:48 crc kubenswrapper[4687]: E0228 09:04:48.658583 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 09:04:48 crc 
kubenswrapper[4687]: E0228 09:04:48.708121 4687 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 28 09:04:48 crc kubenswrapper[4687]: E0228 09:04:48.737840 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:48 crc kubenswrapper[4687]: E0228 09:04:48.838643 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:48 crc kubenswrapper[4687]: E0228 09:04:48.939598 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:49 crc kubenswrapper[4687]: E0228 09:04:49.039880 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:49 crc kubenswrapper[4687]: E0228 09:04:49.140184 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:49 crc kubenswrapper[4687]: E0228 09:04:49.241236 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:49 crc kubenswrapper[4687]: E0228 09:04:49.341358 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:49 crc kubenswrapper[4687]: E0228 09:04:49.442124 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:49 crc kubenswrapper[4687]: E0228 09:04:49.542214 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:49 crc kubenswrapper[4687]: E0228 09:04:49.642703 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:49 crc kubenswrapper[4687]: E0228 09:04:49.742759 4687 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Feb 28 09:04:49 crc kubenswrapper[4687]: E0228 09:04:49.843596 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:49 crc kubenswrapper[4687]: E0228 09:04:49.944192 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:50 crc kubenswrapper[4687]: E0228 09:04:50.044955 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:50 crc kubenswrapper[4687]: E0228 09:04:50.145097 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:50 crc kubenswrapper[4687]: E0228 09:04:50.245925 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:50 crc kubenswrapper[4687]: E0228 09:04:50.346015 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:50 crc kubenswrapper[4687]: E0228 09:04:50.446841 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:50 crc kubenswrapper[4687]: E0228 09:04:50.547942 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:50 crc kubenswrapper[4687]: E0228 09:04:50.648494 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:50 crc kubenswrapper[4687]: E0228 09:04:50.748947 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:50 crc kubenswrapper[4687]: E0228 09:04:50.849342 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:50 crc kubenswrapper[4687]: E0228 09:04:50.949554 4687 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:51 crc kubenswrapper[4687]: E0228 09:04:51.050469 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:51 crc kubenswrapper[4687]: E0228 09:04:51.151244 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:51 crc kubenswrapper[4687]: E0228 09:04:51.251797 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:51 crc kubenswrapper[4687]: E0228 09:04:51.351963 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:51 crc kubenswrapper[4687]: E0228 09:04:51.452829 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:51 crc kubenswrapper[4687]: E0228 09:04:51.553468 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:51 crc kubenswrapper[4687]: E0228 09:04:51.653597 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:51 crc kubenswrapper[4687]: E0228 09:04:51.754601 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:51 crc kubenswrapper[4687]: E0228 09:04:51.854984 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:51 crc kubenswrapper[4687]: E0228 09:04:51.955583 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:52 crc kubenswrapper[4687]: E0228 09:04:52.056507 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:52 crc 
kubenswrapper[4687]: E0228 09:04:52.156651 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:52 crc kubenswrapper[4687]: E0228 09:04:52.257513 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:52 crc kubenswrapper[4687]: E0228 09:04:52.357964 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:52 crc kubenswrapper[4687]: E0228 09:04:52.458729 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:52 crc kubenswrapper[4687]: E0228 09:04:52.559775 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:52 crc kubenswrapper[4687]: E0228 09:04:52.659913 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:52 crc kubenswrapper[4687]: E0228 09:04:52.760626 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:52 crc kubenswrapper[4687]: E0228 09:04:52.861412 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:52 crc kubenswrapper[4687]: E0228 09:04:52.961878 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:53 crc kubenswrapper[4687]: E0228 09:04:53.062469 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:53 crc kubenswrapper[4687]: E0228 09:04:53.162850 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:53 crc kubenswrapper[4687]: E0228 09:04:53.179355 4687 kubelet_node_status.go:585] "Error updating node status, will retry" err="error 
getting node \"crc\": node \"crc\" not found" Feb 28 09:04:53 crc kubenswrapper[4687]: I0228 09:04:53.182768 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:53 crc kubenswrapper[4687]: I0228 09:04:53.182806 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:53 crc kubenswrapper[4687]: I0228 09:04:53.182815 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:53 crc kubenswrapper[4687]: I0228 09:04:53.182828 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:53 crc kubenswrapper[4687]: I0228 09:04:53.182837 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:53Z","lastTransitionTime":"2026-02-28T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:53 crc kubenswrapper[4687]: E0228 09:04:53.189669 4687 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"76119540-8bc2-4cd3-a111-0e11e6360590\\\",\\\"systemUUID\\\":\\\"5b9fb325-94af-4056-b5ce-29e2eb30cdd4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 09:04:53 crc kubenswrapper[4687]: I0228 09:04:53.194977 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:53 crc kubenswrapper[4687]: I0228 09:04:53.195005 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:53 crc kubenswrapper[4687]: I0228 09:04:53.195013 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:53 crc kubenswrapper[4687]: I0228 09:04:53.195036 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:53 crc kubenswrapper[4687]: I0228 09:04:53.195046 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:53Z","lastTransitionTime":"2026-02-28T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:53 crc kubenswrapper[4687]: I0228 09:04:53.217390 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:53 crc kubenswrapper[4687]: I0228 09:04:53.217417 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:53 crc kubenswrapper[4687]: I0228 09:04:53.217425 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:53 crc kubenswrapper[4687]: I0228 09:04:53.217443 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:53 crc kubenswrapper[4687]: I0228 09:04:53.217452 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:53Z","lastTransitionTime":"2026-02-28T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:53 crc kubenswrapper[4687]: I0228 09:04:53.228058 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:53 crc kubenswrapper[4687]: I0228 09:04:53.228085 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:53 crc kubenswrapper[4687]: I0228 09:04:53.228093 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:53 crc kubenswrapper[4687]: I0228 09:04:53.228120 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:53 crc kubenswrapper[4687]: I0228 09:04:53.228129 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:53Z","lastTransitionTime":"2026-02-28T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:53 crc kubenswrapper[4687]: E0228 09:04:53.234557 4687 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"76119540-8bc2-4cd3-a111-0e11e6360590\\\",\\\"systemUUID\\\":\\\"5b9fb325-94af-4056-b5ce-29e2eb30cdd4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 09:04:53 crc kubenswrapper[4687]: E0228 09:04:53.234662 4687 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 28 09:04:53 crc kubenswrapper[4687]: E0228 09:04:53.263618 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:53 crc kubenswrapper[4687]: I0228 09:04:53.331189 4687 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 28 09:04:53 crc kubenswrapper[4687]: E0228 09:04:53.364463 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:53 crc kubenswrapper[4687]: E0228 09:04:53.465180 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:53 crc kubenswrapper[4687]: E0228 09:04:53.565724 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:53 crc kubenswrapper[4687]: E0228 09:04:53.666173 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:53 crc kubenswrapper[4687]: E0228 09:04:53.767242 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:53 crc kubenswrapper[4687]: E0228 09:04:53.868401 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:53 crc kubenswrapper[4687]: E0228 09:04:53.969259 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:54 crc kubenswrapper[4687]: E0228 09:04:54.070244 
4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:54 crc kubenswrapper[4687]: E0228 09:04:54.171133 4687 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.202175 4687 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.251519 4687 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.273419 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.273458 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.273468 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.273482 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.273492 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:54Z","lastTransitionTime":"2026-02-28T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.375261 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.375300 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.375308 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.375321 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.375331 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:54Z","lastTransitionTime":"2026-02-28T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.477294 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.477357 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.477366 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.477385 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.477397 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:54Z","lastTransitionTime":"2026-02-28T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.579593 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.579647 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.579659 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.579678 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.579687 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:54Z","lastTransitionTime":"2026-02-28T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.649912 4687 apiserver.go:52] "Watching apiserver" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.656386 4687 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.657106 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-x5r5v","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjtg2","openshift-machine-config-operator/machine-config-daemon-sbkqn","openshift-multus/multus-8rkhw","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-node-identity/network-node-identity-vrzqb","openshift-ovn-kubernetes/ovnkube-node-pxxbs","openshift-dns/node-resolver-85qxd","openshift-image-registry/node-ca-vw2n2","openshift-multus/network-metrics-daemon-7h597"] Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.658002 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.658055 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.658064 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.658095 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.658100 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 09:04:54 crc kubenswrapper[4687]: E0228 09:04:54.658109 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.658081 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.658184 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.658445 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-x5r5v" Feb 28 09:04:54 crc kubenswrapper[4687]: E0228 09:04:54.658461 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 09:04:54 crc kubenswrapper[4687]: E0228 09:04:54.658675 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.658889 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.659264 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjtg2" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.659739 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.660200 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.661716 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.662057 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-85qxd" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.662079 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.662141 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vw2n2" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.662152 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7h597" Feb 28 09:04:54 crc kubenswrapper[4687]: E0228 09:04:54.662203 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7h597" podUID="8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.662240 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.662323 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.662341 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.662398 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.662458 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.662518 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.662554 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.662713 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.662740 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.662837 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.662946 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.662962 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.662981 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.662948 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.663002 4687 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.663075 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.663155 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.663221 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.663262 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.663307 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.663385 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.663572 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.664364 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.665493 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.665792 4687 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.665799 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.666087 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.666232 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.666295 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.666297 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.666420 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.666451 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.666407 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.666665 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.673361 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.681603 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.681626 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.681635 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.681647 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.681657 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:54Z","lastTransitionTime":"2026-02-28T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.681696 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.687881 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.693782 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd48dfa-192a-4a5b-be30-fc7eebc90da1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5s7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5s7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sbkqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.699714 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjtg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b042334-888e-40c1-92ea-72e4fe52be22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g528j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g528j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gjtg2\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.706249 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8rkhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ee9f985-2783-4c64-913f-c471571a46a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jw8z8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8rkhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.712686 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.713415 4687 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.718431 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7h597" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5z2zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5z2zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7h597\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.726253 4687 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-x5r5v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2fb9df8-4328-4497-a2e3-707301840319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x5r5v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.726499 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.726531 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.726550 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.726567 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.726585 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.726601 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.726620 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.726636 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.726652 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.726667 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.726681 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.726697 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.726710 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.726726 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.726741 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.726765 4687 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.726781 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.726797 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.726811 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.726808 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.726825 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.726841 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727096 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727115 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727241 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727280 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727299 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727314 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727375 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 28 09:04:54 crc 
kubenswrapper[4687]: I0228 09:04:54.727392 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727409 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727424 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727439 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727454 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727470 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727483 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727500 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727516 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727533 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727547 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 28 09:04:54 crc 
kubenswrapper[4687]: I0228 09:04:54.727547 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727565 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727571 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727583 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727597 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727602 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727612 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727623 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). 
InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727660 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727683 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727702 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727720 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727728 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727739 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727769 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727785 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727801 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727802 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727819 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727834 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727852 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727873 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727875 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727888 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 28 09:04:54 crc kubenswrapper[4687]: E0228 09:04:54.727937 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:04:55.227887887 +0000 UTC m=+86.918457225 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727958 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727979 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727994 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.728008 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.728050 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.729274 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.729378 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.729446 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.729539 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.729623 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.729697 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.729780 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.729854 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.729927 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.730011 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.730097 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.730171 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.730242 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.730304 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.730374 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.730448 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.730514 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.730663 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.730737 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 28 
09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.730826 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.730898 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.730967 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.731044 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.731125 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.731199 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" 
(UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.731266 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.731333 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.731401 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.731468 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.731536 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 28 09:04:54 crc 
kubenswrapper[4687]: I0228 09:04:54.731602 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.731668 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.731727 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.731807 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.731876 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.731947 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.732058 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.732147 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.732461 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.732579 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.732676 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 28 
09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.732750 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.732837 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.732904 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.732973 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.733062 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.733134 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.733225 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.733323 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.733389 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.733457 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.733526 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.733595 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.733664 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.733960 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734005 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734046 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734086 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " 
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734157 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734182 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734205 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734224 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734245 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734267 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734288 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734306 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734327 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734345 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734361 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734380 
4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.732133 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-85qxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a50af8f-7793-4165-980f-140b2700d716\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjrwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-85qxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734401 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734548 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" 
(UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734576 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734650 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734673 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734694 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734712 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734733 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734770 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734790 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734812 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734835 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734862 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 
28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734881 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734901 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734921 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734937 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735003 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735048 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735073 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735093 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735129 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735150 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735170 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 28 09:04:54 crc 
kubenswrapper[4687]: I0228 09:04:54.735209 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735229 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735250 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735362 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735391 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735432 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735455 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735479 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735497 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735518 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735538 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735555 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735574 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735594 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735615 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735635 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735657 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735679 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735696 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735718 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735746 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735775 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735799 4687 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735821 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735840 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735858 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735878 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735899 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735922 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735942 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735962 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735980 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.736001 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.736033 
4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.736053 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.736070 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.736090 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.736110 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.736129 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.736159 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.736178 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.736195 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.727957 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.736911 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-multus-conf-dir\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.736947 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-run-systemd\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.736967 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab3c8fdc-f423-43f7-b0d8-484490cdcfdb-host\") pod \"node-ca-vw2n2\" (UID: \"ab3c8fdc-f423-43f7-b0d8-484490cdcfdb\") " pod="openshift-image-registry/node-ca-vw2n2" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.737064 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3-metrics-certs\") pod \"network-metrics-daemon-7h597\" (UID: \"8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3\") " pod="openshift-multus/network-metrics-daemon-7h597" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.737096 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjrwg\" (UniqueName: \"kubernetes.io/projected/8a50af8f-7793-4165-980f-140b2700d716-kube-api-access-pjrwg\") pod \"node-resolver-85qxd\" (UID: \"8a50af8f-7793-4165-980f-140b2700d716\") " 
pod="openshift-dns/node-resolver-85qxd" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.737128 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.737153 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-etc-openvswitch\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.737178 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-cni-bin\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.737204 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.737224 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-host-run-k8s-cni-cncf-io\") pod 
\"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.737563 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8ee9f985-2783-4c64-913f-c471571a46a3-multus-daemon-config\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.737610 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-cni-netd\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.737647 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.737683 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw8z8\" (UniqueName: \"kubernetes.io/projected/8ee9f985-2783-4c64-913f-c471571a46a3-kube-api-access-jw8z8\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.737712 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e2fb9df8-4328-4497-a2e3-707301840319-os-release\") pod 
\"multus-additional-cni-plugins-x5r5v\" (UID: \"e2fb9df8-4328-4497-a2e3-707301840319\") " pod="openshift-multus/multus-additional-cni-plugins-x5r5v" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.737733 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e2fb9df8-4328-4497-a2e3-707301840319-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x5r5v\" (UID: \"e2fb9df8-4328-4497-a2e3-707301840319\") " pod="openshift-multus/multus-additional-cni-plugins-x5r5v" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.737763 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z2zz\" (UniqueName: \"kubernetes.io/projected/8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3-kube-api-access-5z2zz\") pod \"network-metrics-daemon-7h597\" (UID: \"8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3\") " pod="openshift-multus/network-metrics-daemon-7h597" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.737786 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e2fb9df8-4328-4497-a2e3-707301840319-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x5r5v\" (UID: \"e2fb9df8-4328-4497-a2e3-707301840319\") " pod="openshift-multus/multus-additional-cni-plugins-x5r5v" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.737796 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.737808 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.737705 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.738332 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.738460 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.738539 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.738675 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.738897 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.739128 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.737813 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.728304 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.728348 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.728469 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.740189 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.728544 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.728596 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.728617 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.728657 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.728729 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.740247 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.740279 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.728802 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.728851 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.728850 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.728870 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.728966 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.728989 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.729035 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.729261 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.729356 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.729379 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.729495 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.729616 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.729634 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.729632 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.729869 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.729976 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.730104 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.730126 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.730219 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.730350 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.730353 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.730468 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.730863 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.730876 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.730901 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.731014 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.731174 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.731417 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.731435 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.731658 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.732548 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.732637 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.732805 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.732864 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.732950 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.733070 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.733117 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.733210 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.733468 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.733592 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.733642 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.733936 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734391 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734438 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734447 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.733656 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734463 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734667 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734722 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734745 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734881 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734977 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.734995 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735204 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735482 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735618 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735708 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.735772 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.736061 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.736245 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.736428 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.736421 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.736461 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.736791 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.736827 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.736834 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.728484 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.740289 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.740323 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.740444 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.740531 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.740592 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.740631 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.740685 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.740718 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.740765 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.740787 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.740806 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.740991 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.741145 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.741360 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.741690 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.741692 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.741706 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.741722 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.741743 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.741839 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-host-var-lib-cni-bin\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.742068 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.742108 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.742131 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.742178 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.742264 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.742528 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.742647 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.742672 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.742700 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.742744 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.742847 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.742813 4687 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.742951 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.741067 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.743099 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.743138 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.742822 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.743313 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.743366 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.743490 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.743500 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.743609 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.743667 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.743694 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.743790 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: E0228 09:04:54.743940 4687 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.744091 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.744115 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.744122 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.744277 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.744340 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.744379 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: E0228 09:04:54.744406 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 09:04:55.243983251 +0000 UTC m=+86.934552588 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.744410 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.744459 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dcd48dfa-192a-4a5b-be30-fc7eebc90da1-mcd-auth-proxy-config\") pod \"machine-config-daemon-sbkqn\" (UID: \"dcd48dfa-192a-4a5b-be30-fc7eebc90da1\") " pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.744480 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-kubelet\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.744499 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-slash\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.743941 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.744549 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.744654 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.744822 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8a50af8f-7793-4165-980f-140b2700d716-hosts-file\") pod \"node-resolver-85qxd\" (UID: \"8a50af8f-7793-4165-980f-140b2700d716\") " pod="openshift-dns/node-resolver-85qxd" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.744843 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-host-var-lib-kubelet\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.744862 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/e2fb9df8-4328-4497-a2e3-707301840319-system-cni-dir\") pod \"multus-additional-cni-plugins-x5r5v\" (UID: \"e2fb9df8-4328-4497-a2e3-707301840319\") " pod="openshift-multus/multus-additional-cni-plugins-x5r5v" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.744880 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g528j\" (UniqueName: \"kubernetes.io/projected/7b042334-888e-40c1-92ea-72e4fe52be22-kube-api-access-g528j\") pod \"ovnkube-control-plane-749d76644c-gjtg2\" (UID: \"7b042334-888e-40c1-92ea-72e4fe52be22\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjtg2" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.744896 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqfcw\" (UniqueName: \"kubernetes.io/projected/4fb29f6b-2e87-454b-966f-5202547e1b6d-kube-api-access-pqfcw\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.745012 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-system-cni-dir\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.745253 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-systemd-units\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.745304 4687 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.745323 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.745334 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.745358 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.745368 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e2fb9df8-4328-4497-a2e3-707301840319-cni-binary-copy\") pod \"multus-additional-cni-plugins-x5r5v\" (UID: \"e2fb9df8-4328-4497-a2e3-707301840319\") " pod="openshift-multus/multus-additional-cni-plugins-x5r5v" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.745399 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b042334-888e-40c1-92ea-72e4fe52be22-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gjtg2\" (UID: \"7b042334-888e-40c1-92ea-72e4fe52be22\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjtg2" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.745421 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxdzd\" (UniqueName: \"kubernetes.io/projected/ab3c8fdc-f423-43f7-b0d8-484490cdcfdb-kube-api-access-mxdzd\") pod \"node-ca-vw2n2\" (UID: \"ab3c8fdc-f423-43f7-b0d8-484490cdcfdb\") " pod="openshift-image-registry/node-ca-vw2n2" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.745484 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.745519 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-cnibin\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.745536 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-host-run-netns\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.745576 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-run-ovn\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.745632 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.745687 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.745737 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.745948 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.745952 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-multus-cni-dir\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.745993 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-os-release\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746042 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-hostroot\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746062 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbms9\" (UniqueName: \"kubernetes.io/projected/e2fb9df8-4328-4497-a2e3-707301840319-kube-api-access-cbms9\") pod \"multus-additional-cni-plugins-x5r5v\" (UID: \"e2fb9df8-4328-4497-a2e3-707301840319\") " pod="openshift-multus/multus-additional-cni-plugins-x5r5v" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.745993 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746083 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-run-openvswitch\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746105 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746125 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746144 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-multus-socket-dir-parent\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746161 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-host-run-multus-certs\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746176 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-etc-kubernetes\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746195 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-log-socket\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746210 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746228 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 09:04:54 crc kubenswrapper[4687]: E0228 09:04:54.746234 4687 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746249 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dcd48dfa-192a-4a5b-be30-fc7eebc90da1-proxy-tls\") pod \"machine-config-daemon-sbkqn\" (UID: \"dcd48dfa-192a-4a5b-be30-fc7eebc90da1\") " pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746268 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-run-ovn-kubernetes\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746284 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4fb29f6b-2e87-454b-966f-5202547e1b6d-ovnkube-config\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746300 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-host-var-lib-cni-multus\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746316 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e2fb9df8-4328-4497-a2e3-707301840319-cnibin\") pod 
\"multus-additional-cni-plugins-x5r5v\" (UID: \"e2fb9df8-4328-4497-a2e3-707301840319\") " pod="openshift-multus/multus-additional-cni-plugins-x5r5v" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746335 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk5s7\" (UniqueName: \"kubernetes.io/projected/dcd48dfa-192a-4a5b-be30-fc7eebc90da1-kube-api-access-rk5s7\") pod \"machine-config-daemon-sbkqn\" (UID: \"dcd48dfa-192a-4a5b-be30-fc7eebc90da1\") " pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746352 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746376 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746394 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746392 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: E0228 09:04:54.746429 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 09:04:55.246406529 +0000 UTC m=+86.936975856 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746463 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b042334-888e-40c1-92ea-72e4fe52be22-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gjtg2\" (UID: \"7b042334-888e-40c1-92ea-72e4fe52be22\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjtg2" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746496 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/dcd48dfa-192a-4a5b-be30-fc7eebc90da1-rootfs\") pod \"machine-config-daemon-sbkqn\" (UID: \"dcd48dfa-192a-4a5b-be30-fc7eebc90da1\") " pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746518 4687 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-node-log\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746538 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4fb29f6b-2e87-454b-966f-5202547e1b6d-ovn-node-metrics-cert\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746557 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4fb29f6b-2e87-454b-966f-5202547e1b6d-ovnkube-script-lib\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746575 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ab3c8fdc-f423-43f7-b0d8-484490cdcfdb-serviceca\") pod \"node-ca-vw2n2\" (UID: \"ab3c8fdc-f423-43f7-b0d8-484490cdcfdb\") " pod="openshift-image-registry/node-ca-vw2n2" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746567 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 
09:04:54.746596 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8ee9f985-2783-4c64-913f-c471571a46a3-cni-binary-copy\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746616 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b042334-888e-40c1-92ea-72e4fe52be22-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gjtg2\" (UID: \"7b042334-888e-40c1-92ea-72e4fe52be22\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjtg2" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746633 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-run-netns\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746648 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-var-lib-openvswitch\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746658 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746663 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4fb29f6b-2e87-454b-966f-5202547e1b6d-env-overrides\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.746551 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.747093 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.747712 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vw2n2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab3c8fdc-f423-43f7-b0d8-484490cdcfdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxdzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vw2n2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.747743 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.747835 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.748091 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.748332 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.748335 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.747798 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.750213 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.750296 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.750363 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.750812 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.750839 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.750851 4687 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.750866 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.750875 4687 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.750887 4687 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.750899 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" 
DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.750909 4687 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.750918 4687 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.750929 4687 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.750943 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.750953 4687 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.750962 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.750970 4687 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc 
kubenswrapper[4687]: I0228 09:04:54.750979 4687 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.750988 4687 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751001 4687 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751010 4687 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751044 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751056 4687 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751051 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751067 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751076 4687 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751086 4687 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751095 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751105 4687 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751113 4687 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751121 4687 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath 
\"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751128 4687 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751136 4687 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751143 4687 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751151 4687 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751159 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751168 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751177 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751186 4687 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751195 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751203 4687 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751212 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751220 4687 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751228 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751236 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751244 4687 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751252 4687 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751260 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751269 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751279 4687 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751288 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751296 4687 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751304 4687 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751312 4687 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751321 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751329 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751337 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751345 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751355 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751362 4687 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" 
DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751371 4687 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751380 4687 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751388 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751397 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751405 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751413 4687 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751421 4687 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751429 4687 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751437 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751446 4687 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751455 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751463 4687 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751472 4687 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751480 4687 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751489 4687 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751498 4687 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751505 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751514 4687 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751522 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751529 4687 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751537 4687 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751547 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node 
\"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751555 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751563 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751570 4687 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751578 4687 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751587 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751594 4687 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751603 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751611 4687 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751619 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751631 4687 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.751639 4687 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.755123 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.755392 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.755492 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.755578 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: E0228 09:04:54.755582 4687 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.755611 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: E0228 09:04:54.755638 4687 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 09:04:54 crc kubenswrapper[4687]: E0228 09:04:54.755654 4687 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.755676 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: E0228 09:04:54.755711 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-28 09:04:55.255694784 +0000 UTC m=+86.946264122 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.755979 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.756381 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.756400 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: E0228 09:04:54.756887 4687 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 09:04:54 crc kubenswrapper[4687]: E0228 09:04:54.756994 4687 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 09:04:54 crc kubenswrapper[4687]: E0228 09:04:54.757303 4687 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 09:04:54 crc kubenswrapper[4687]: E0228 09:04:54.757425 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-28 09:04:55.257394733 +0000 UTC m=+86.947964070 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.757923 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.758325 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.758344 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.758650 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.759231 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.759883 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.759998 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.760050 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.760314 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.760330 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.760397 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.760427 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.760444 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.761237 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.761707 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.762118 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.762423 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.763130 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.767309 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb29f6b-2e87-454b-966f-5202547e1b6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pxxbs\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.767464 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.773343 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.774734 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.775004 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.778436 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.779923 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.783673 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.783700 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.783711 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.783761 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.783772 4687 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:54Z","lastTransitionTime":"2026-02-28T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852197 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z2zz\" (UniqueName: \"kubernetes.io/projected/8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3-kube-api-access-5z2zz\") pod \"network-metrics-daemon-7h597\" (UID: \"8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3\") " pod="openshift-multus/network-metrics-daemon-7h597" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852230 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e2fb9df8-4328-4497-a2e3-707301840319-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x5r5v\" (UID: \"e2fb9df8-4328-4497-a2e3-707301840319\") " pod="openshift-multus/multus-additional-cni-plugins-x5r5v" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852248 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-kubelet\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852263 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-slash\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc 
kubenswrapper[4687]: I0228 09:04:54.852282 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852297 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-host-var-lib-cni-bin\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852315 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dcd48dfa-192a-4a5b-be30-fc7eebc90da1-mcd-auth-proxy-config\") pod \"machine-config-daemon-sbkqn\" (UID: \"dcd48dfa-192a-4a5b-be30-fc7eebc90da1\") " pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852331 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8a50af8f-7793-4165-980f-140b2700d716-hosts-file\") pod \"node-resolver-85qxd\" (UID: \"8a50af8f-7793-4165-980f-140b2700d716\") " pod="openshift-dns/node-resolver-85qxd" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852347 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-host-var-lib-kubelet\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 
09:04:54.852360 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e2fb9df8-4328-4497-a2e3-707301840319-system-cni-dir\") pod \"multus-additional-cni-plugins-x5r5v\" (UID: \"e2fb9df8-4328-4497-a2e3-707301840319\") " pod="openshift-multus/multus-additional-cni-plugins-x5r5v" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852360 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-slash\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852380 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g528j\" (UniqueName: \"kubernetes.io/projected/7b042334-888e-40c1-92ea-72e4fe52be22-kube-api-access-g528j\") pod \"ovnkube-control-plane-749d76644c-gjtg2\" (UID: \"7b042334-888e-40c1-92ea-72e4fe52be22\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjtg2" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852397 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqfcw\" (UniqueName: \"kubernetes.io/projected/4fb29f6b-2e87-454b-966f-5202547e1b6d-kube-api-access-pqfcw\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852413 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-system-cni-dir\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852426 4687 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-systemd-units\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852431 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852441 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e2fb9df8-4328-4497-a2e3-707301840319-cni-binary-copy\") pod \"multus-additional-cni-plugins-x5r5v\" (UID: \"e2fb9df8-4328-4497-a2e3-707301840319\") " pod="openshift-multus/multus-additional-cni-plugins-x5r5v" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852398 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-host-var-lib-kubelet\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852456 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b042334-888e-40c1-92ea-72e4fe52be22-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gjtg2\" (UID: \"7b042334-888e-40c1-92ea-72e4fe52be22\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjtg2" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852464 4687 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8a50af8f-7793-4165-980f-140b2700d716-hosts-file\") pod \"node-resolver-85qxd\" (UID: \"8a50af8f-7793-4165-980f-140b2700d716\") " pod="openshift-dns/node-resolver-85qxd" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852514 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e2fb9df8-4328-4497-a2e3-707301840319-system-cni-dir\") pod \"multus-additional-cni-plugins-x5r5v\" (UID: \"e2fb9df8-4328-4497-a2e3-707301840319\") " pod="openshift-multus/multus-additional-cni-plugins-x5r5v" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852526 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-system-cni-dir\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852541 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-systemd-units\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852472 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxdzd\" (UniqueName: \"kubernetes.io/projected/ab3c8fdc-f423-43f7-b0d8-484490cdcfdb-kube-api-access-mxdzd\") pod \"node-ca-vw2n2\" (UID: \"ab3c8fdc-f423-43f7-b0d8-484490cdcfdb\") " pod="openshift-image-registry/node-ca-vw2n2" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852583 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852635 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852646 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-host-var-lib-cni-bin\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852679 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-cnibin\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852372 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-kubelet\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852709 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-host-run-netns\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852743 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-run-ovn\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852746 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-cnibin\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852769 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-run-openvswitch\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852788 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-multus-cni-dir\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852812 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-os-release\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " 
pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852830 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-run-ovn\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852843 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-hostroot\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852870 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-hostroot\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852916 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbms9\" (UniqueName: \"kubernetes.io/projected/e2fb9df8-4328-4497-a2e3-707301840319-kube-api-access-cbms9\") pod \"multus-additional-cni-plugins-x5r5v\" (UID: \"e2fb9df8-4328-4497-a2e3-707301840319\") " pod="openshift-multus/multus-additional-cni-plugins-x5r5v" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852899 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-host-run-netns\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.852994 4687 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-multus-socket-dir-parent\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.853095 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-os-release\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.853103 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b042334-888e-40c1-92ea-72e4fe52be22-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gjtg2\" (UID: \"7b042334-888e-40c1-92ea-72e4fe52be22\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjtg2"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.853116 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-run-openvswitch\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.853118 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e2fb9df8-4328-4497-a2e3-707301840319-cni-binary-copy\") pod \"multus-additional-cni-plugins-x5r5v\" (UID: \"e2fb9df8-4328-4497-a2e3-707301840319\") " pod="openshift-multus/multus-additional-cni-plugins-x5r5v"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.853143 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-host-run-multus-certs\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.853152 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-multus-socket-dir-parent\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.853205 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e2fb9df8-4328-4497-a2e3-707301840319-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x5r5v\" (UID: \"e2fb9df8-4328-4497-a2e3-707301840319\") " pod="openshift-multus/multus-additional-cni-plugins-x5r5v"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854068 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-multus-cni-dir\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854086 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dcd48dfa-192a-4a5b-be30-fc7eebc90da1-mcd-auth-proxy-config\") pod \"machine-config-daemon-sbkqn\" (UID: \"dcd48dfa-192a-4a5b-be30-fc7eebc90da1\") " pod="openshift-machine-config-operator/machine-config-daemon-sbkqn"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854113 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-etc-kubernetes\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854125 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-host-run-multus-certs\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854166 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-log-socket\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854187 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854217 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-log-socket\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854219 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dcd48dfa-192a-4a5b-be30-fc7eebc90da1-proxy-tls\") pod \"machine-config-daemon-sbkqn\" (UID: \"dcd48dfa-192a-4a5b-be30-fc7eebc90da1\") " pod="openshift-machine-config-operator/machine-config-daemon-sbkqn"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854324 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-run-ovn-kubernetes\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854346 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4fb29f6b-2e87-454b-966f-5202547e1b6d-ovnkube-config\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854362 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-host-var-lib-cni-multus\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854401 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e2fb9df8-4328-4497-a2e3-707301840319-cnibin\") pod \"multus-additional-cni-plugins-x5r5v\" (UID: \"e2fb9df8-4328-4497-a2e3-707301840319\") " pod="openshift-multus/multus-additional-cni-plugins-x5r5v"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854419 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk5s7\" (UniqueName: \"kubernetes.io/projected/dcd48dfa-192a-4a5b-be30-fc7eebc90da1-kube-api-access-rk5s7\") pod \"machine-config-daemon-sbkqn\" (UID: \"dcd48dfa-192a-4a5b-be30-fc7eebc90da1\") " pod="openshift-machine-config-operator/machine-config-daemon-sbkqn"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854435 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b042334-888e-40c1-92ea-72e4fe52be22-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gjtg2\" (UID: \"7b042334-888e-40c1-92ea-72e4fe52be22\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjtg2"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854459 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/dcd48dfa-192a-4a5b-be30-fc7eebc90da1-rootfs\") pod \"machine-config-daemon-sbkqn\" (UID: \"dcd48dfa-192a-4a5b-be30-fc7eebc90da1\") " pod="openshift-machine-config-operator/machine-config-daemon-sbkqn"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854477 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-host-var-lib-cni-multus\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854492 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-etc-kubernetes\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854604 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/dcd48dfa-192a-4a5b-be30-fc7eebc90da1-rootfs\") pod \"machine-config-daemon-sbkqn\" (UID: \"dcd48dfa-192a-4a5b-be30-fc7eebc90da1\") " pod="openshift-machine-config-operator/machine-config-daemon-sbkqn"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854665 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e2fb9df8-4328-4497-a2e3-707301840319-cnibin\") pod \"multus-additional-cni-plugins-x5r5v\" (UID: \"e2fb9df8-4328-4497-a2e3-707301840319\") " pod="openshift-multus/multus-additional-cni-plugins-x5r5v"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854714 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854731 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-run-ovn-kubernetes\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854777 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-node-log\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854805 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4fb29f6b-2e87-454b-966f-5202547e1b6d-ovn-node-metrics-cert\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854823 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4fb29f6b-2e87-454b-966f-5202547e1b6d-ovnkube-script-lib\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854857 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ab3c8fdc-f423-43f7-b0d8-484490cdcfdb-serviceca\") pod \"node-ca-vw2n2\" (UID: \"ab3c8fdc-f423-43f7-b0d8-484490cdcfdb\") " pod="openshift-image-registry/node-ca-vw2n2"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854872 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8ee9f985-2783-4c64-913f-c471571a46a3-cni-binary-copy\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.854887 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b042334-888e-40c1-92ea-72e4fe52be22-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gjtg2\" (UID: \"7b042334-888e-40c1-92ea-72e4fe52be22\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjtg2"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.855083 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-run-netns\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.855113 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-var-lib-openvswitch\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.855132 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4fb29f6b-2e87-454b-966f-5202547e1b6d-env-overrides\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.855153 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-multus-conf-dir\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.855197 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-run-systemd\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.855212 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab3c8fdc-f423-43f7-b0d8-484490cdcfdb-host\") pod \"node-ca-vw2n2\" (UID: \"ab3c8fdc-f423-43f7-b0d8-484490cdcfdb\") " pod="openshift-image-registry/node-ca-vw2n2"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.855228 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3-metrics-certs\") pod \"network-metrics-daemon-7h597\" (UID: \"8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3\") " pod="openshift-multus/network-metrics-daemon-7h597"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.855248 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4fb29f6b-2e87-454b-966f-5202547e1b6d-ovnkube-config\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.855282 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-node-log\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.855302 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-multus-conf-dir\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.855681 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-run-netns\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.855718 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-var-lib-openvswitch\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.855738 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-run-systemd\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.855762 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab3c8fdc-f423-43f7-b0d8-484490cdcfdb-host\") pod \"node-ca-vw2n2\" (UID: \"ab3c8fdc-f423-43f7-b0d8-484490cdcfdb\") " pod="openshift-image-registry/node-ca-vw2n2"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.856256 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4fb29f6b-2e87-454b-966f-5202547e1b6d-ovnkube-script-lib\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.856257 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8ee9f985-2783-4c64-913f-c471571a46a3-cni-binary-copy\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw"
Feb 28 09:04:54 crc kubenswrapper[4687]: E0228 09:04:54.856397 4687 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 28 09:04:54 crc kubenswrapper[4687]: E0228 09:04:54.856446 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3-metrics-certs podName:8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3 nodeName:}" failed. No retries permitted until 2026-02-28 09:04:55.356434192 +0000 UTC m=+87.047003529 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3-metrics-certs") pod "network-metrics-daemon-7h597" (UID: "8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.856717 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4fb29f6b-2e87-454b-966f-5202547e1b6d-env-overrides\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.856859 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ab3c8fdc-f423-43f7-b0d8-484490cdcfdb-serviceca\") pod \"node-ca-vw2n2\" (UID: \"ab3c8fdc-f423-43f7-b0d8-484490cdcfdb\") " pod="openshift-image-registry/node-ca-vw2n2"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.857720 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b042334-888e-40c1-92ea-72e4fe52be22-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gjtg2\" (UID: \"7b042334-888e-40c1-92ea-72e4fe52be22\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjtg2"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.857854 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjrwg\" (UniqueName: \"kubernetes.io/projected/8a50af8f-7793-4165-980f-140b2700d716-kube-api-access-pjrwg\") pod \"node-resolver-85qxd\" (UID: \"8a50af8f-7793-4165-980f-140b2700d716\") " pod="openshift-dns/node-resolver-85qxd"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.858540 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-etc-openvswitch\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.858648 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b042334-888e-40c1-92ea-72e4fe52be22-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gjtg2\" (UID: \"7b042334-888e-40c1-92ea-72e4fe52be22\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjtg2"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.858686 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-cni-bin\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.858710 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-host-run-k8s-cni-cncf-io\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.858740 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8ee9f985-2783-4c64-913f-c471571a46a3-host-run-k8s-cni-cncf-io\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.858761 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8ee9f985-2783-4c64-913f-c471571a46a3-multus-daemon-config\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.858645 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-etc-openvswitch\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.858713 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-cni-bin\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.858783 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-cni-netd\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.858827 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw8z8\" (UniqueName: \"kubernetes.io/projected/8ee9f985-2783-4c64-913f-c471571a46a3-kube-api-access-jw8z8\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.858881 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-cni-netd\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.858969 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e2fb9df8-4328-4497-a2e3-707301840319-os-release\") pod \"multus-additional-cni-plugins-x5r5v\" (UID: \"e2fb9df8-4328-4497-a2e3-707301840319\") " pod="openshift-multus/multus-additional-cni-plugins-x5r5v"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859002 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e2fb9df8-4328-4497-a2e3-707301840319-os-release\") pod \"multus-additional-cni-plugins-x5r5v\" (UID: \"e2fb9df8-4328-4497-a2e3-707301840319\") " pod="openshift-multus/multus-additional-cni-plugins-x5r5v"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859045 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e2fb9df8-4328-4497-a2e3-707301840319-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x5r5v\" (UID: \"e2fb9df8-4328-4497-a2e3-707301840319\") " pod="openshift-multus/multus-additional-cni-plugins-x5r5v"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859169 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859187 4687 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859188 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8ee9f985-2783-4c64-913f-c471571a46a3-multus-daemon-config\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw"
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859196 4687 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859233 4687 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859244 4687 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859254 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859264 4687 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859272 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859282 4687 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859291 4687 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859299 4687 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859308 4687 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859320 4687 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859331 4687 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859340 4687 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859348 4687 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859358 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859367 4687 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859375 4687 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859383 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859394 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859593 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859601 4687 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859609 4687 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859616 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859623 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859631 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859639 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859647 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859656 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859664 4687 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859671 4687 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859681 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859689 4687 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859698 4687 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859706 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859714 4687 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859722 4687 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859731 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859739 4687 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859750 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859768 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859775 4687 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859783 4687 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859791 4687 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859802 4687 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859810 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859817 4687 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859827 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859834 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859841 4687 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859848 4687 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName:
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859859 4687 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859868 4687 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859876 4687 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859884 4687 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859892 4687 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859901 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859909 4687 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 
crc kubenswrapper[4687]: I0228 09:04:54.859917 4687 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859924 4687 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859932 4687 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859940 4687 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859949 4687 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859958 4687 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859968 4687 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc 
kubenswrapper[4687]: I0228 09:04:54.859976 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859986 4687 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.859993 4687 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860000 4687 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860008 4687 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860031 4687 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860041 4687 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860049 4687 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860058 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860066 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860074 4687 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860083 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860091 4687 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860098 4687 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860106 4687 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860116 4687 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860118 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e2fb9df8-4328-4497-a2e3-707301840319-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x5r5v\" (UID: \"e2fb9df8-4328-4497-a2e3-707301840319\") " pod="openshift-multus/multus-additional-cni-plugins-x5r5v" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860124 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860466 4687 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860476 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860485 4687 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860493 4687 reconciler_common.go:293] "Volume detached for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860503 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860511 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860520 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860528 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860536 4687 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860544 4687 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860551 4687 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node 
\"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860559 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860567 4687 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860575 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860583 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860590 4687 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860598 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860607 4687 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc 
kubenswrapper[4687]: I0228 09:04:54.860615 4687 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860623 4687 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860630 4687 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860639 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.860436 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4fb29f6b-2e87-454b-966f-5202547e1b6d-ovn-node-metrics-cert\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.863197 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dcd48dfa-192a-4a5b-be30-fc7eebc90da1-proxy-tls\") pod \"machine-config-daemon-sbkqn\" (UID: \"dcd48dfa-192a-4a5b-be30-fc7eebc90da1\") " pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.864612 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxdzd\" 
(UniqueName: \"kubernetes.io/projected/ab3c8fdc-f423-43f7-b0d8-484490cdcfdb-kube-api-access-mxdzd\") pod \"node-ca-vw2n2\" (UID: \"ab3c8fdc-f423-43f7-b0d8-484490cdcfdb\") " pod="openshift-image-registry/node-ca-vw2n2" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.865656 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqfcw\" (UniqueName: \"kubernetes.io/projected/4fb29f6b-2e87-454b-966f-5202547e1b6d-kube-api-access-pqfcw\") pod \"ovnkube-node-pxxbs\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.866245 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g528j\" (UniqueName: \"kubernetes.io/projected/7b042334-888e-40c1-92ea-72e4fe52be22-kube-api-access-g528j\") pod \"ovnkube-control-plane-749d76644c-gjtg2\" (UID: \"7b042334-888e-40c1-92ea-72e4fe52be22\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjtg2" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.866393 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z2zz\" (UniqueName: \"kubernetes.io/projected/8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3-kube-api-access-5z2zz\") pod \"network-metrics-daemon-7h597\" (UID: \"8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3\") " pod="openshift-multus/network-metrics-daemon-7h597" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.869491 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjrwg\" (UniqueName: \"kubernetes.io/projected/8a50af8f-7793-4165-980f-140b2700d716-kube-api-access-pjrwg\") pod \"node-resolver-85qxd\" (UID: \"8a50af8f-7793-4165-980f-140b2700d716\") " pod="openshift-dns/node-resolver-85qxd" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.869794 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk5s7\" (UniqueName: 
\"kubernetes.io/projected/dcd48dfa-192a-4a5b-be30-fc7eebc90da1-kube-api-access-rk5s7\") pod \"machine-config-daemon-sbkqn\" (UID: \"dcd48dfa-192a-4a5b-be30-fc7eebc90da1\") " pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.870164 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbms9\" (UniqueName: \"kubernetes.io/projected/e2fb9df8-4328-4497-a2e3-707301840319-kube-api-access-cbms9\") pod \"multus-additional-cni-plugins-x5r5v\" (UID: \"e2fb9df8-4328-4497-a2e3-707301840319\") " pod="openshift-multus/multus-additional-cni-plugins-x5r5v" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.871050 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw8z8\" (UniqueName: \"kubernetes.io/projected/8ee9f985-2783-4c64-913f-c471571a46a3-kube-api-access-jw8z8\") pod \"multus-8rkhw\" (UID: \"8ee9f985-2783-4c64-913f-c471571a46a3\") " pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.886159 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.886190 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.886218 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.886232 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.886242 4687 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:54Z","lastTransitionTime":"2026-02-28T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.972522 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.977322 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8rkhw" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.983064 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.987914 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.987945 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.987954 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.987966 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.987974 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:54Z","lastTransitionTime":"2026-02-28T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.989051 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 28 09:04:54 crc kubenswrapper[4687]: W0228 09:04:54.992457 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-11c0b6d2e2756ce1a7c650b120b05519db8d1bf8d8f257bca82b26cf2c4db30d WatchSource:0}: Error finding container 11c0b6d2e2756ce1a7c650b120b05519db8d1bf8d8f257bca82b26cf2c4db30d: Status 404 returned error can't find the container with id 11c0b6d2e2756ce1a7c650b120b05519db8d1bf8d8f257bca82b26cf2c4db30d Feb 28 09:04:54 crc kubenswrapper[4687]: I0228 09:04:54.995159 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-x5r5v" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.001468 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" Feb 28 09:04:55 crc kubenswrapper[4687]: W0228 09:04:55.003349 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-62ce5ffef48512e5468e6c531d11e56a83cd97db32babda5c03d57b5153e28fc WatchSource:0}: Error finding container 62ce5ffef48512e5468e6c531d11e56a83cd97db32babda5c03d57b5153e28fc: Status 404 returned error can't find the container with id 62ce5ffef48512e5468e6c531d11e56a83cd97db32babda5c03d57b5153e28fc Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.006870 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjtg2" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.011459 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-85qxd" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.017016 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vw2n2" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.022358 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:04:55 crc kubenswrapper[4687]: W0228 09:04:55.035912 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcd48dfa_192a_4a5b_be30_fc7eebc90da1.slice/crio-8f4829a5b1c67827a1cf4b061d2886fe9279cffbce96d3a09dc6332e60cb2e64 WatchSource:0}: Error finding container 8f4829a5b1c67827a1cf4b061d2886fe9279cffbce96d3a09dc6332e60cb2e64: Status 404 returned error can't find the container with id 8f4829a5b1c67827a1cf4b061d2886fe9279cffbce96d3a09dc6332e60cb2e64 Feb 28 09:04:55 crc kubenswrapper[4687]: W0228 09:04:55.045097 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b042334_888e_40c1_92ea_72e4fe52be22.slice/crio-987f3233ce8c81abf3ac92e2ebb1b8720dd26acbd8baecac96c285b655e1e38a WatchSource:0}: Error finding container 987f3233ce8c81abf3ac92e2ebb1b8720dd26acbd8baecac96c285b655e1e38a: Status 404 returned error can't find the container with id 987f3233ce8c81abf3ac92e2ebb1b8720dd26acbd8baecac96c285b655e1e38a Feb 28 09:04:55 crc kubenswrapper[4687]: W0228 09:04:55.045669 4687 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a50af8f_7793_4165_980f_140b2700d716.slice/crio-7c087c2d2783f7c28535ecac11a2da44b63acae0b71b0c0e38f7b76c30c39fd5 WatchSource:0}: Error finding container 7c087c2d2783f7c28535ecac11a2da44b63acae0b71b0c0e38f7b76c30c39fd5: Status 404 returned error can't find the container with id 7c087c2d2783f7c28535ecac11a2da44b63acae0b71b0c0e38f7b76c30c39fd5 Feb 28 09:04:55 crc kubenswrapper[4687]: W0228 09:04:55.054128 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fb29f6b_2e87_454b_966f_5202547e1b6d.slice/crio-99eb6843f3a9b1bb0df85a41197c4957994630f01196752b0dd5e9b8984d629a WatchSource:0}: Error finding container 99eb6843f3a9b1bb0df85a41197c4957994630f01196752b0dd5e9b8984d629a: Status 404 returned error can't find the container with id 99eb6843f3a9b1bb0df85a41197c4957994630f01196752b0dd5e9b8984d629a Feb 28 09:04:55 crc kubenswrapper[4687]: W0228 09:04:55.058280 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab3c8fdc_f423_43f7_b0d8_484490cdcfdb.slice/crio-609ce601719681a104fa7ac33f10da891268c99859b8e1203733635782b2fc3a WatchSource:0}: Error finding container 609ce601719681a104fa7ac33f10da891268c99859b8e1203733635782b2fc3a: Status 404 returned error can't find the container with id 609ce601719681a104fa7ac33f10da891268c99859b8e1203733635782b2fc3a Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.089866 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.089891 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.089899 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.089914 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.090001 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:55Z","lastTransitionTime":"2026-02-28T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.194247 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.194474 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.194485 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.194503 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.194513 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:55Z","lastTransitionTime":"2026-02-28T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.264241 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.264352 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 09:04:55 crc kubenswrapper[4687]: E0228 09:04:55.264389 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:04:56.264368172 +0000 UTC m=+87.954937509 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.264424 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 09:04:55 crc kubenswrapper[4687]: E0228 09:04:55.264478 4687 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 09:04:55 crc kubenswrapper[4687]: E0228 09:04:55.264495 4687 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.264506 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 09:04:55 crc kubenswrapper[4687]: E0228 09:04:55.264517 4687 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.264527 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 09:04:55 crc kubenswrapper[4687]: E0228 09:04:55.264559 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-28 09:04:56.264545796 +0000 UTC m=+87.955115133 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 09:04:55 crc kubenswrapper[4687]: E0228 09:04:55.264616 4687 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 09:04:55 crc kubenswrapper[4687]: E0228 09:04:55.264648 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-28 09:04:56.264641747 +0000 UTC m=+87.955211084 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 09:04:55 crc kubenswrapper[4687]: E0228 09:04:55.264659 4687 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 09:04:55 crc kubenswrapper[4687]: E0228 09:04:55.264672 4687 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 09:04:55 crc kubenswrapper[4687]: E0228 09:04:55.264672 4687 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 09:04:55 crc kubenswrapper[4687]: E0228 09:04:55.264683 4687 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 09:04:55 crc kubenswrapper[4687]: E0228 09:04:55.264709 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 09:04:56.264698333 +0000 UTC m=+87.955267670 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 09:04:55 crc kubenswrapper[4687]: E0228 09:04:55.264722 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-28 09:04:56.264717509 +0000 UTC m=+87.955286846 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.297871 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.297908 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.297918 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.297930 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.297940 4687 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:55Z","lastTransitionTime":"2026-02-28T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.365738 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3-metrics-certs\") pod \"network-metrics-daemon-7h597\" (UID: \"8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3\") " pod="openshift-multus/network-metrics-daemon-7h597" Feb 28 09:04:55 crc kubenswrapper[4687]: E0228 09:04:55.365949 4687 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 09:04:55 crc kubenswrapper[4687]: E0228 09:04:55.366061 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3-metrics-certs podName:8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3 nodeName:}" failed. No retries permitted until 2026-02-28 09:04:56.366041036 +0000 UTC m=+88.056610373 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3-metrics-certs") pod "network-metrics-daemon-7h597" (UID: "8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.400150 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.400186 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.400196 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.400210 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.400219 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:55Z","lastTransitionTime":"2026-02-28T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.502227 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.502272 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.502283 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.502301 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.502312 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:55Z","lastTransitionTime":"2026-02-28T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.603778 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.603814 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.603828 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.603843 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.603852 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:55Z","lastTransitionTime":"2026-02-28T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.706483 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.706516 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.706525 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.706541 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.706550 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:55Z","lastTransitionTime":"2026-02-28T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.808955 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.809155 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.809165 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.809182 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.809190 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:55Z","lastTransitionTime":"2026-02-28T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.911227 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.911264 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.911272 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.911289 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.911297 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:55Z","lastTransitionTime":"2026-02-28T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.929497 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-85qxd" event={"ID":"8a50af8f-7793-4165-980f-140b2700d716","Type":"ContainerStarted","Data":"ed717180a08a4dc94278de55ebeedc983b52ca7fb942b41c168e46b6d13f4b29"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.929535 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-85qxd" event={"ID":"8a50af8f-7793-4165-980f-140b2700d716","Type":"ContainerStarted","Data":"7c087c2d2783f7c28535ecac11a2da44b63acae0b71b0c0e38f7b76c30c39fd5"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.930319 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"62ce5ffef48512e5468e6c531d11e56a83cd97db32babda5c03d57b5153e28fc"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.932705 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8rkhw" event={"ID":"8ee9f985-2783-4c64-913f-c471571a46a3","Type":"ContainerStarted","Data":"201fdaa3afc315e7615e0485f0fa4a8903fd0890ebeadae45599f1f4dd946034"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.932747 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8rkhw" event={"ID":"8ee9f985-2783-4c64-913f-c471571a46a3","Type":"ContainerStarted","Data":"ffd2b4445b393d74e165d16ef4cd4c7088ee85be84cc7463cd85b684b7f86c42"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.933833 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"475b877870847c25c990577da990f7bd9e99a76679aa57e817f1d78f3549ed7e"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.933871 4687 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"eaf2ef8bc2b25a30ca4259ef34aec8fe6d9862c6ab6a69e92d04b8a3cd46f9b0"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.935224 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjtg2" event={"ID":"7b042334-888e-40c1-92ea-72e4fe52be22","Type":"ContainerStarted","Data":"311f01f31cb502794ca20aa31c67f593cfad8fa0344cff8c09455c3489ed4a81"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.935274 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjtg2" event={"ID":"7b042334-888e-40c1-92ea-72e4fe52be22","Type":"ContainerStarted","Data":"942bae1486060c20a50ae50c9ca1f8d4cd6141af138a73bd09605fa69f0de3a4"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.935291 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjtg2" event={"ID":"7b042334-888e-40c1-92ea-72e4fe52be22","Type":"ContainerStarted","Data":"987f3233ce8c81abf3ac92e2ebb1b8720dd26acbd8baecac96c285b655e1e38a"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.936469 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ad4a2ec33965877f07781405b5b2d4692cfeb4d6e6761b626f7deb190936b65f"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.936500 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"54ee8e1e6dc9b347f82b7904614732b1102aae25b1bb8b0e2975fb7f69b5039c"} Feb 28 09:04:55 crc 
kubenswrapper[4687]: I0228 09:04:55.936515 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"11c0b6d2e2756ce1a7c650b120b05519db8d1bf8d8f257bca82b26cf2c4db30d"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.937529 4687 generic.go:334] "Generic (PLEG): container finished" podID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerID="1a421cd585e8ec22da1e4d34b32242f47995669a55b9c04c7c202de0cc70c595" exitCode=0 Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.937607 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" event={"ID":"4fb29f6b-2e87-454b-966f-5202547e1b6d","Type":"ContainerDied","Data":"1a421cd585e8ec22da1e4d34b32242f47995669a55b9c04c7c202de0cc70c595"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.937642 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" event={"ID":"4fb29f6b-2e87-454b-966f-5202547e1b6d","Type":"ContainerStarted","Data":"99eb6843f3a9b1bb0df85a41197c4957994630f01196752b0dd5e9b8984d629a"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.939945 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerStarted","Data":"c3f1b77acec189e9d98cd9a4dde011ade5c1af7d389bfec50179735461b6f92d"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.939973 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerStarted","Data":"a4fa09ae345698d6959b87a651d6646b2e144c55db675e36a768b83892b2c64d"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.939986 4687 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerStarted","Data":"8f4829a5b1c67827a1cf4b061d2886fe9279cffbce96d3a09dc6332e60cb2e64"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.941390 4687 generic.go:334] "Generic (PLEG): container finished" podID="e2fb9df8-4328-4497-a2e3-707301840319" containerID="d94753b8326bf1392477f2a5dbb0f5bd1d0d57061743595319d117f191d323a1" exitCode=0 Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.941435 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x5r5v" event={"ID":"e2fb9df8-4328-4497-a2e3-707301840319","Type":"ContainerDied","Data":"d94753b8326bf1392477f2a5dbb0f5bd1d0d57061743595319d117f191d323a1"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.941454 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x5r5v" event={"ID":"e2fb9df8-4328-4497-a2e3-707301840319","Type":"ContainerStarted","Data":"2c51752648bad8fb8b0753371a301973a971df8d8b458f60e298719a8afbc53e"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.944228 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vw2n2" event={"ID":"ab3c8fdc-f423-43f7-b0d8-484490cdcfdb","Type":"ContainerStarted","Data":"1e0ea093d6e5d71487b08d52a6457136c68f27f54f71034805700e165db95e45"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.944264 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vw2n2" event={"ID":"ab3c8fdc-f423-43f7-b0d8-484490cdcfdb","Type":"ContainerStarted","Data":"609ce601719681a104fa7ac33f10da891268c99859b8e1203733635782b2fc3a"} Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.947858 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:55Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.964706 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd48dfa-192a-4a5b-be30-fc7eebc90da1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5s7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5s7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sbkqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:55Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.975997 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:55Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.984837 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8rkhw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ee9f985-2783-4c64-913f-c471571a46a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jw8z8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8rkhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:55Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:55 crc kubenswrapper[4687]: I0228 09:04:55.994503 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:55Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.006538 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjtg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b042334-888e-40c1-92ea-72e4fe52be22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g528j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g528j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gjtg2\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.012853 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.012882 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.012891 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.012902 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.012910 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:56Z","lastTransitionTime":"2026-02-28T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.019925 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x5r5v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2fb9df8-4328-4497-a2e3-707301840319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x5r5v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.030349 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-85qxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a50af8f-7793-4165-980f-140b2700d716\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed717180a08a4dc94278de55ebeedc983b52ca7fb942b41c168e46b6d13f4b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac3
9aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T09:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjrwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-85qxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.038889 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vw2n2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab3c8fdc-f423-43f7-b0d8-484490cdcfdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxdzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vw2n2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.050549 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7h597" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5z2zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5z2zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7h597\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:56 crc 
kubenswrapper[4687]: I0228 09:04:56.061660 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.072427 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.083400 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.097572 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb29f6b-2e87-454b-966f-5202547e1b6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pxxbs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.108292 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.115875 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.115902 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.115913 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.115928 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.115936 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:56Z","lastTransitionTime":"2026-02-28T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.118435 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjtg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b042334-888e-40c1-92ea-72e4fe52be22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://942bae1486060c20a50ae50c9ca1f8d4cd6141af138a73bd09605fa69f0de3a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T09:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"
ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g528j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311f01f31cb502794ca20aa31c67f593cfad8fa0344cff8c09455c3489ed4a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T09:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g528j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gjtg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.131828 4687 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-multus/multus-8rkhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ee9f985-2783-4c64-913f-c471571a46a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201fdaa3afc315e7615e0485f0fa4a8903fd0890ebeadae45599f1f4dd946034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T09:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"m
ountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jw8z8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8rkhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.139270 4687 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-vw2n2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab3c8fdc-f423-43f7-b0d8-484490cdcfdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e0ea093d6e5d71487b08d52a6457136c68f27f54f71034805700e165db95e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T09:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxdzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"
hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vw2n2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.147166 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7h597" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5z2zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5z2zz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7h597\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:56 crc 
kubenswrapper[4687]: I0228 09:04:56.158995 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x5r5v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2fb9df8-4328-4497-a2e3-707301840319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94753b8326bf1392477f2a5dbb0f5bd1d0d57061743595319d117f191d323a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94753b8326bf1392477f2a5dbb0f5bd1d0d57061743595319d117f191d323a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T09:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x5r5v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.172305 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-85qxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a50af8f-7793-4165-980f-140b2700d716\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed717180a08a4dc94278de55ebeedc983b52ca7fb942b41c168e46b6d13f4b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T09:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjrwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-85qxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.182941 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4a2ec33965877f07781405b5b2d4692cfeb4d6e6761b626f7deb190936b65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T09:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ee8e1e6dc9b347f82b7904614732b1102aae25b1bb8b0e2975fb7f69b5039c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T09:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.209483 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.218242 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.218274 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.218283 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.218298 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.218307 4687 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:56Z","lastTransitionTime":"2026-02-28T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.237187 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb29f6b-2e87-454b-966f-5202547e1b6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a421cd585e8ec22da1e4d34b32242f47995669a55b9c04c7c202de0cc70c595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a421cd585e8ec22da1e4d34b32242f47995669a55b9c04c7c202de0cc70c595\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T09:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pxxbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.248869 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.262214 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475b877870847c25c990577da990f7bd9e99a76679aa57e817f1d78f3549ed7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T09:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.273176 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.274712 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.274825 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 09:04:56 crc 
kubenswrapper[4687]: I0228 09:04:56.274852 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 09:04:56 crc kubenswrapper[4687]: E0228 09:04:56.274867 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:04:58.274847727 +0000 UTC m=+89.965417064 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.274933 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 09:04:56 crc kubenswrapper[4687]: E0228 09:04:56.274954 4687 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.274963 4687 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 09:04:56 crc kubenswrapper[4687]: E0228 09:04:56.274996 4687 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 09:04:56 crc kubenswrapper[4687]: E0228 09:04:56.275009 4687 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 09:04:56 crc kubenswrapper[4687]: E0228 09:04:56.274960 4687 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 09:04:56 crc kubenswrapper[4687]: E0228 09:04:56.275069 4687 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 09:04:56 crc kubenswrapper[4687]: E0228 09:04:56.275078 4687 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 09:04:56 crc kubenswrapper[4687]: E0228 09:04:56.275085 4687 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 09:04:56 crc kubenswrapper[4687]: E0228 09:04:56.274997 4687 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 09:04:58.274985446 +0000 UTC m=+89.965554784 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 09:04:56 crc kubenswrapper[4687]: E0228 09:04:56.275038 4687 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 09:04:56 crc kubenswrapper[4687]: E0228 09:04:56.275136 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 09:04:58.275122203 +0000 UTC m=+89.965691531 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 09:04:56 crc kubenswrapper[4687]: E0228 09:04:56.275149 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-28 09:04:58.275143604 +0000 UTC m=+89.965712941 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 09:04:56 crc kubenswrapper[4687]: E0228 09:04:56.275222 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-28 09:04:58.275197555 +0000 UTC m=+89.965766882 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.281226 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd48dfa-192a-4a5b-be30-fc7eebc90da1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f1b77acec189e9d98cd9a4dde011ade5c1af7d389bfec50179735461b6f92d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T09:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5s7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4fa09ae345698d6959b87a651d6646b2e144c55db675e36a768b83892b2c64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T09:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5s7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sbkqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.320007 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.320050 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.320061 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.320075 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.320088 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:56Z","lastTransitionTime":"2026-02-28T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.375417 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3-metrics-certs\") pod \"network-metrics-daemon-7h597\" (UID: \"8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3\") " pod="openshift-multus/network-metrics-daemon-7h597" Feb 28 09:04:56 crc kubenswrapper[4687]: E0228 09:04:56.375534 4687 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 09:04:56 crc kubenswrapper[4687]: E0228 09:04:56.375580 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3-metrics-certs podName:8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3 nodeName:}" failed. No retries permitted until 2026-02-28 09:04:58.375567467 +0000 UTC m=+90.066136805 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3-metrics-certs") pod "network-metrics-daemon-7h597" (UID: "8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.424529 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.424558 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.424567 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.424580 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.424588 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:56Z","lastTransitionTime":"2026-02-28T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.526615 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.526642 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.526649 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.526660 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.526670 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:56Z","lastTransitionTime":"2026-02-28T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.627885 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.627914 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.627923 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.627935 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.627949 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:56Z","lastTransitionTime":"2026-02-28T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.656591 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7h597" Feb 28 09:04:56 crc kubenswrapper[4687]: E0228 09:04:56.656681 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7h597" podUID="8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.656591 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 09:04:56 crc kubenswrapper[4687]: E0228 09:04:56.656750 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.656917 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 09:04:56 crc kubenswrapper[4687]: E0228 09:04:56.657139 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.657166 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 09:04:56 crc kubenswrapper[4687]: E0228 09:04:56.657403 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.659981 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.660579 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.661611 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.662181 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.663094 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.663563 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.664118 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.664973 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.665560 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.666419 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.666894 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.667850 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.668321 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.668775 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.669578 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.670106 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.670936 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.671323 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.671807 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.672680 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.673133 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.673962 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.674436 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.675335 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.675746 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.676304 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.677232 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.677646 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.678508 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.678961 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.679703 4687 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.679809 4687 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.681233 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.682029 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.682427 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.683731 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.684419 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.685235 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.685789 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.686698 4687 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.687324 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.687917 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.688539 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.689121 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.689550 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.690072 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.690572 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.691255 4687 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.691709 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.692181 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.692608 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.693098 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.693595 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.694045 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.729881 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.729987 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.730176 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.730334 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.730487 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:56Z","lastTransitionTime":"2026-02-28T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.831845 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.831877 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.831887 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.831901 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.831910 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:56Z","lastTransitionTime":"2026-02-28T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.934136 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.934165 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.934177 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.934189 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.934197 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:56Z","lastTransitionTime":"2026-02-28T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.948423 4687 generic.go:334] "Generic (PLEG): container finished" podID="e2fb9df8-4328-4497-a2e3-707301840319" containerID="883e5ffd4a8ec50875155dfa27a595855e7ba6e3f2dec0956d3ca48b6264a9a8" exitCode=0 Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.948487 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x5r5v" event={"ID":"e2fb9df8-4328-4497-a2e3-707301840319","Type":"ContainerDied","Data":"883e5ffd4a8ec50875155dfa27a595855e7ba6e3f2dec0956d3ca48b6264a9a8"} Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.949871 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a9c78ee3308a7acbecca51e97b75f6e8ae87001a3969684616f6bcb7489bc458"} Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.952795 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" event={"ID":"4fb29f6b-2e87-454b-966f-5202547e1b6d","Type":"ContainerStarted","Data":"b31592026f046beb9f55087c61629bae72620ad9b911c9119a51fde9fcfbc400"} Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.952825 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" event={"ID":"4fb29f6b-2e87-454b-966f-5202547e1b6d","Type":"ContainerStarted","Data":"eab119a0723fc00191b2e42af13974c57d3b8a6e093f64a447ef44e6233939f1"} Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.952836 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" event={"ID":"4fb29f6b-2e87-454b-966f-5202547e1b6d","Type":"ContainerStarted","Data":"e05d992d540a0c3e4ff5b4e369ab55f1b052cba2c72e2326420dc0fe965876ef"} Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.952845 4687 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" event={"ID":"4fb29f6b-2e87-454b-966f-5202547e1b6d","Type":"ContainerStarted","Data":"0df0f66d344881d62a51a3d7cb2e78e7d70b39296f6db151f0b32128eef0e952"} Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.952856 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" event={"ID":"4fb29f6b-2e87-454b-966f-5202547e1b6d","Type":"ContainerStarted","Data":"6c2efc7fdb93bd1977dfd4f6eb040fd7b07952a96fd088fbedadc3d247b5a8cf"} Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.952863 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" event={"ID":"4fb29f6b-2e87-454b-966f-5202547e1b6d","Type":"ContainerStarted","Data":"3315597e6c52853b482ae4832ea6feecfc9df5367e543744629c2c2c01549524"} Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.961782 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.974378 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4a2ec33965877f07781405b5b2d4692cfeb4d6e6761b626f7deb190936b65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T09:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ee8e1e6dc9b347f82b7904614732b1102aae25b1bb8b0e2975fb7f69b5039c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T09:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:56 crc kubenswrapper[4687]: I0228 09:04:56.986665 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.001330 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fb29f6b-2e87-454b-966f-5202547e1b6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a421cd585e8ec22da1e4d34b32242f47995669a55b9c04c7c202de0cc70c595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a421cd585e8ec22da1e4d34b32242f47995669a55b9c04c7c202de0cc70c595\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T09:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pqfcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pxxbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:56Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.010997 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://475b877870847c25c990577da990f7bd9e99a76679aa57e817f1d78f3549ed7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T09:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:57Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.019323 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:57Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.027573 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd48dfa-192a-4a5b-be30-fc7eebc90da1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f1b77acec189e9d98cd9a4dde011ade5c1af7d389bfec50179735461b6f92d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T09:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5s7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4fa09ae345698d6959b87a651d6646b2e144c55
db675e36a768b83892b2c64d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T09:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rk5s7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sbkqn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:57Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.035985 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.036012 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.036036 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:57 crc 
kubenswrapper[4687]: I0228 09:04:57.036049 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.036058 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:57Z","lastTransitionTime":"2026-02-28T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.037261 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8rkhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ee9f985-2783-4c64-913f-c471571a46a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201fdaa3afc315e7615e0485f0fa4a8903fd0890ebeadae45599f1f4dd946034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T09:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jw8z8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8rkhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:57Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.054258 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:57Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.061844 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjtg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b042334-888e-40c1-92ea-72e4fe52be22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://942bae1486060c20a50ae50c9ca1f8d4cd6141af138a73bd09605fa69f0de3a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T09:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g528j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311f01f31cb502794ca20aa31c67f593cfad8
fa0344cff8c09455c3489ed4a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T09:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g528j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gjtg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:57Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.071330 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x5r5v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2fb9df8-4328-4497-a2e3-707301840319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94753b8326bf1392477f2a5dbb0f5bd1d0d57061743595319d117f191d323a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94753b8326bf1392477f2a5dbb0f5bd1d0d57061743595319d117f191d323a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T09:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://883e5ffd4a8ec50875155dfa27a595855e7ba6e3f2dec0956d3ca48b6264a9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://883e5ffd4a8ec50875155dfa27a595855e7ba6e3f2dec0956d3ca48b6264a9a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T09:04:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T09:04:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbms9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x5r5v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:57Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.078463 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-85qxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a50af8f-7793-4165-980f-140b2700d716\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T09:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed717180a08a4dc94278de55ebeedc983b52ca7fb942b41c168e46b6d13f4b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T09:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjrwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T09:04:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-85qxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T09:04:57Z is after 2025-08-24T17:21:41Z" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.093860 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vw2n2" podStartSLOduration=60.093846219 podStartE2EDuration="1m0.093846219s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:04:57.093770976 +0000 UTC m=+88.784340314" watchObservedRunningTime="2026-02-28 09:04:57.093846219 +0000 UTC m=+88.784415566" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.120785 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8rkhw" podStartSLOduration=60.120771678 podStartE2EDuration="1m0.120771678s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:04:57.112535422 +0000 UTC m=+88.803104769" watchObservedRunningTime="2026-02-28 09:04:57.120771678 +0000 UTC m=+88.811341015" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.138308 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.138339 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 
09:04:57.138348 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.138367 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.138375 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:57Z","lastTransitionTime":"2026-02-28T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.143195 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gjtg2" podStartSLOduration=60.143184586 podStartE2EDuration="1m0.143184586s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:04:57.130988149 +0000 UTC m=+88.821557496" watchObservedRunningTime="2026-02-28 09:04:57.143184586 +0000 UTC m=+88.833753924" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.159216 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-85qxd" podStartSLOduration=60.159199658 podStartE2EDuration="1m0.159199658s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:04:57.150296026 +0000 UTC m=+88.840865364" watchObservedRunningTime="2026-02-28 09:04:57.159199658 +0000 UTC m=+88.849768996" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 
09:04:57.232005 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podStartSLOduration=60.231988317 podStartE2EDuration="1m0.231988317s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:04:57.230866246 +0000 UTC m=+88.921435582" watchObservedRunningTime="2026-02-28 09:04:57.231988317 +0000 UTC m=+88.922557644" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.240580 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.240611 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.240620 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.240632 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.240640 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:57Z","lastTransitionTime":"2026-02-28T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.343429 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.343466 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.343476 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.343490 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.343499 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:57Z","lastTransitionTime":"2026-02-28T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.446236 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.446277 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.446286 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.446301 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.446311 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:57Z","lastTransitionTime":"2026-02-28T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.548943 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.548988 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.548997 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.549013 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.549043 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:57Z","lastTransitionTime":"2026-02-28T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.651278 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.651321 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.651332 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.651348 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.651358 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:57Z","lastTransitionTime":"2026-02-28T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.753469 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.753503 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.753512 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.753533 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.753543 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:57Z","lastTransitionTime":"2026-02-28T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.856384 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.856425 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.856433 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.856448 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.856458 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:57Z","lastTransitionTime":"2026-02-28T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.957381 4687 generic.go:334] "Generic (PLEG): container finished" podID="e2fb9df8-4328-4497-a2e3-707301840319" containerID="a0a27a69dd5d23b8cf7e6f9e4100695dd30b6bc29efa3fa7a2e04209784b1e32" exitCode=0 Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.957426 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x5r5v" event={"ID":"e2fb9df8-4328-4497-a2e3-707301840319","Type":"ContainerDied","Data":"a0a27a69dd5d23b8cf7e6f9e4100695dd30b6bc29efa3fa7a2e04209784b1e32"} Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.957738 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.957775 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.957786 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.957800 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:57 crc kubenswrapper[4687]: I0228 09:04:57.957809 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:57Z","lastTransitionTime":"2026-02-28T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.060123 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.060184 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.060194 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.060216 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.060228 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:58Z","lastTransitionTime":"2026-02-28T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.165805 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.165854 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.165865 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.165883 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.165893 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:58Z","lastTransitionTime":"2026-02-28T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.268336 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.268374 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.268385 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.268399 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.268408 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:58Z","lastTransitionTime":"2026-02-28T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.291769 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.291843 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.291868 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 09:04:58 crc kubenswrapper[4687]: E0228 09:04:58.291952 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:02.291926256 +0000 UTC m=+93.982495594 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:04:58 crc kubenswrapper[4687]: E0228 09:04:58.291968 4687 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 09:04:58 crc kubenswrapper[4687]: E0228 09:04:58.292005 4687 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 09:04:58 crc kubenswrapper[4687]: E0228 09:04:58.292054 4687 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 09:04:58 crc kubenswrapper[4687]: E0228 09:04:58.292070 4687 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.292009 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 09:04:58 
crc kubenswrapper[4687]: E0228 09:04:58.292108 4687 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 09:04:58 crc kubenswrapper[4687]: E0228 09:04:58.292130 4687 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 09:04:58 crc kubenswrapper[4687]: E0228 09:04:58.292142 4687 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 09:04:58 crc kubenswrapper[4687]: E0228 09:04:58.292032 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 09:05:02.292005064 +0000 UTC m=+93.982574402 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 09:04:58 crc kubenswrapper[4687]: E0228 09:04:58.292182 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-28 09:05:02.292174263 +0000 UTC m=+93.982743600 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.292197 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 09:04:58 crc kubenswrapper[4687]: E0228 09:04:58.292253 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-28 09:05:02.292247721 +0000 UTC m=+93.982817059 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 09:04:58 crc kubenswrapper[4687]: E0228 09:04:58.292269 4687 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 09:04:58 crc kubenswrapper[4687]: E0228 09:04:58.292327 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 09:05:02.29231109 +0000 UTC m=+93.982880427 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.370961 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.370995 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.371003 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.371036 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.371049 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:58Z","lastTransitionTime":"2026-02-28T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.393495 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3-metrics-certs\") pod \"network-metrics-daemon-7h597\" (UID: \"8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3\") " pod="openshift-multus/network-metrics-daemon-7h597" Feb 28 09:04:58 crc kubenswrapper[4687]: E0228 09:04:58.393652 4687 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 09:04:58 crc kubenswrapper[4687]: E0228 09:04:58.393715 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3-metrics-certs podName:8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3 nodeName:}" failed. No retries permitted until 2026-02-28 09:05:02.393701011 +0000 UTC m=+94.084270358 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3-metrics-certs") pod "network-metrics-daemon-7h597" (UID: "8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.473437 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.473481 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.473490 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.473505 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.473514 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:58Z","lastTransitionTime":"2026-02-28T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.575936 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.575970 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.575979 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.575994 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.576003 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:58Z","lastTransitionTime":"2026-02-28T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.656315 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7h597" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.656325 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.656370 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 09:04:58 crc kubenswrapper[4687]: E0228 09:04:58.656954 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7h597" podUID="8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.656981 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 09:04:58 crc kubenswrapper[4687]: E0228 09:04:58.657107 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 09:04:58 crc kubenswrapper[4687]: E0228 09:04:58.657211 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 09:04:58 crc kubenswrapper[4687]: E0228 09:04:58.657326 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.678182 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.678216 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.678227 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.678240 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.678250 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:58Z","lastTransitionTime":"2026-02-28T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.780116 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.780144 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.780152 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.780162 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.780172 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:58Z","lastTransitionTime":"2026-02-28T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.884487 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.884521 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.884531 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.884548 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.884558 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:58Z","lastTransitionTime":"2026-02-28T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.963062 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" event={"ID":"4fb29f6b-2e87-454b-966f-5202547e1b6d","Type":"ContainerStarted","Data":"45a682fb389eb2922f9a81a4926ed88798abfd97ba87f9f2856af948689542ec"} Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.965151 4687 generic.go:334] "Generic (PLEG): container finished" podID="e2fb9df8-4328-4497-a2e3-707301840319" containerID="51c58d0f7d10e9c11582d6668102c29e8907d7dc44b123d7968168d2b2620cfd" exitCode=0 Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.965192 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x5r5v" event={"ID":"e2fb9df8-4328-4497-a2e3-707301840319","Type":"ContainerDied","Data":"51c58d0f7d10e9c11582d6668102c29e8907d7dc44b123d7968168d2b2620cfd"} Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.986089 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.986141 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.986152 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.986164 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:58 crc kubenswrapper[4687]: I0228 09:04:58.986173 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:58Z","lastTransitionTime":"2026-02-28T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.088671 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.088719 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.088733 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.088754 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.088779 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:59Z","lastTransitionTime":"2026-02-28T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.190491 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.190752 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.190773 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.190790 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.190800 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:59Z","lastTransitionTime":"2026-02-28T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.295510 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.295548 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.295557 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.295571 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.295581 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:59Z","lastTransitionTime":"2026-02-28T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.397330 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.397358 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.397366 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.397377 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.397384 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:59Z","lastTransitionTime":"2026-02-28T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.502386 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.502420 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.502429 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.502440 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.502449 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:59Z","lastTransitionTime":"2026-02-28T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.604777 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.604805 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.604813 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.604825 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.604834 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:59Z","lastTransitionTime":"2026-02-28T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.706536 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.706562 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.706569 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.706578 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.706586 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:59Z","lastTransitionTime":"2026-02-28T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.808435 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.808465 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.808474 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.808484 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.808492 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:59Z","lastTransitionTime":"2026-02-28T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.910608 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.910643 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.910653 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.910667 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.910677 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:04:59Z","lastTransitionTime":"2026-02-28T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.975110 4687 generic.go:334] "Generic (PLEG): container finished" podID="e2fb9df8-4328-4497-a2e3-707301840319" containerID="5beca11df0feb6e8e86842059da1aa945b5d68ab84cc52f037c537b6d3c99683" exitCode=0 Feb 28 09:04:59 crc kubenswrapper[4687]: I0228 09:04:59.975185 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x5r5v" event={"ID":"e2fb9df8-4328-4497-a2e3-707301840319","Type":"ContainerDied","Data":"5beca11df0feb6e8e86842059da1aa945b5d68ab84cc52f037c537b6d3c99683"} Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.012559 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.012596 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.012605 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.012619 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.012628 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:00Z","lastTransitionTime":"2026-02-28T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.115102 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.115128 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.115137 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.115148 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.115156 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:00Z","lastTransitionTime":"2026-02-28T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.217750 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.217804 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.217814 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.217834 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.217844 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:00Z","lastTransitionTime":"2026-02-28T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.321124 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.321168 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.321182 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.321200 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.321210 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:00Z","lastTransitionTime":"2026-02-28T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.423403 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.423442 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.423451 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.423466 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.423476 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:00Z","lastTransitionTime":"2026-02-28T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.525100 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.525140 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.525148 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.525163 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.525172 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:00Z","lastTransitionTime":"2026-02-28T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.627238 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.627467 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.627480 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.627497 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.627506 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:00Z","lastTransitionTime":"2026-02-28T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.656050 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.656083 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7h597" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.656047 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.656183 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 09:05:00 crc kubenswrapper[4687]: E0228 09:05:00.656303 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 09:05:00 crc kubenswrapper[4687]: E0228 09:05:00.656466 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 09:05:00 crc kubenswrapper[4687]: E0228 09:05:00.656507 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7h597" podUID="8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3" Feb 28 09:05:00 crc kubenswrapper[4687]: E0228 09:05:00.656573 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.668329 4687 scope.go:117] "RemoveContainer" containerID="110dc193591d77cad10858a579d47ef5c71456399bf60b68f6b36dc40fc19406" Feb 28 09:05:00 crc kubenswrapper[4687]: E0228 09:05:00.668566 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.669338 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.730629 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.730670 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.730681 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.730697 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.730708 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:00Z","lastTransitionTime":"2026-02-28T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.833457 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.833494 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.833503 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.833518 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.833529 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:00Z","lastTransitionTime":"2026-02-28T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.935912 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.935943 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.935951 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.935962 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.935971 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:00Z","lastTransitionTime":"2026-02-28T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.979982 4687 generic.go:334] "Generic (PLEG): container finished" podID="e2fb9df8-4328-4497-a2e3-707301840319" containerID="247be8f4be2f50e366e3fb9a7116908ac358197b1799478573af5c0c1e67bc99" exitCode=0 Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.980009 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x5r5v" event={"ID":"e2fb9df8-4328-4497-a2e3-707301840319","Type":"ContainerDied","Data":"247be8f4be2f50e366e3fb9a7116908ac358197b1799478573af5c0c1e67bc99"} Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.985310 4687 scope.go:117] "RemoveContainer" containerID="110dc193591d77cad10858a579d47ef5c71456399bf60b68f6b36dc40fc19406" Feb 28 09:05:00 crc kubenswrapper[4687]: E0228 09:05:00.985435 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.985503 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" event={"ID":"4fb29f6b-2e87-454b-966f-5202547e1b6d","Type":"ContainerStarted","Data":"254ddd94a12e93d8d2140366c6c34589ae43e5587dcf0c17e398855fc360a886"} Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.985806 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 09:05:00.985832 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:05:00 crc kubenswrapper[4687]: I0228 
09:05:00.985842 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.009499 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.010262 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.040070 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.040106 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.040118 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.040137 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.040148 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:01Z","lastTransitionTime":"2026-02-28T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.041859 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" podStartSLOduration=64.041847267 podStartE2EDuration="1m4.041847267s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:01.041708105 +0000 UTC m=+92.732277452" watchObservedRunningTime="2026-02-28 09:05:01.041847267 +0000 UTC m=+92.732416604" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.142137 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.142183 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.142200 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.142216 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.142226 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:01Z","lastTransitionTime":"2026-02-28T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.244950 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.244986 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.244994 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.245009 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.245036 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:01Z","lastTransitionTime":"2026-02-28T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.347807 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.347866 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.347876 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.347899 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.347912 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:01Z","lastTransitionTime":"2026-02-28T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.450530 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.450982 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.450996 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.451034 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.451047 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:01Z","lastTransitionTime":"2026-02-28T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.553710 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.553747 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.553759 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.553786 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.553795 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:01Z","lastTransitionTime":"2026-02-28T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.656215 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.656246 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.656256 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.656267 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.656274 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:01Z","lastTransitionTime":"2026-02-28T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.758502 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.758548 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.758559 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.758577 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.758588 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:01Z","lastTransitionTime":"2026-02-28T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.862154 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.862211 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.862225 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.862250 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.862264 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:01Z","lastTransitionTime":"2026-02-28T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.964160 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.964208 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.964219 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.964235 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.964245 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:01Z","lastTransitionTime":"2026-02-28T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:01 crc kubenswrapper[4687]: I0228 09:05:01.990754 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x5r5v" event={"ID":"e2fb9df8-4328-4497-a2e3-707301840319","Type":"ContainerStarted","Data":"9b1e74cdc5021330691a5239f4a85c4e9d2e8b4f2baba38dbf9ca1aa658c5cbf"} Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.014467 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-x5r5v" podStartSLOduration=65.014436543 podStartE2EDuration="1m5.014436543s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:02.010603322 +0000 UTC m=+93.701172669" watchObservedRunningTime="2026-02-28 09:05:02.014436543 +0000 UTC m=+93.705005880" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.066952 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.067004 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.067039 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.067063 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.067077 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:02Z","lastTransitionTime":"2026-02-28T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.169042 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.169091 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.169101 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.169119 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.169131 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:02Z","lastTransitionTime":"2026-02-28T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.271927 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.271981 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.271992 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.272015 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.272069 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:02Z","lastTransitionTime":"2026-02-28T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 09:05:02 crc kubenswrapper[4687]: E0228 09:05:02.331903 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:10.331872705 +0000 UTC m=+102.022442043 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.331752 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:02 crc kubenswrapper[4687]: E0228 09:05:02.332337 4687 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 09:05:02 crc kubenswrapper[4687]: E0228 09:05:02.332363 4687 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 09:05:02 crc kubenswrapper[4687]: E0228 09:05:02.332379 4687 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 09:05:02 crc kubenswrapper[4687]: E0228 09:05:02.332434 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-02-28 09:05:10.332426779 +0000 UTC m=+102.022996115 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.332141 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.332988 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 09:05:02 crc kubenswrapper[4687]: E0228 09:05:02.333109 4687 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 09:05:02 crc kubenswrapper[4687]: E0228 09:05:02.333155 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-28 09:05:10.333147645 +0000 UTC m=+102.023716972 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.333118 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.333195 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 09:05:02 crc kubenswrapper[4687]: E0228 09:05:02.333263 4687 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 09:05:02 crc kubenswrapper[4687]: E0228 09:05:02.333357 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 09:05:10.333335789 +0000 UTC m=+102.023905126 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 09:05:02 crc kubenswrapper[4687]: E0228 09:05:02.333277 4687 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 09:05:02 crc kubenswrapper[4687]: E0228 09:05:02.333409 4687 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 09:05:02 crc kubenswrapper[4687]: E0228 09:05:02.333428 4687 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 09:05:02 crc kubenswrapper[4687]: E0228 09:05:02.333463 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-28 09:05:10.333456717 +0000 UTC m=+102.024026053 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.374281 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.374318 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.374328 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.374347 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.374359 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:02Z","lastTransitionTime":"2026-02-28T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.381877 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7h597"] Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.382085 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7h597" Feb 28 09:05:02 crc kubenswrapper[4687]: E0228 09:05:02.382222 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7h597" podUID="8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.434278 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3-metrics-certs\") pod \"network-metrics-daemon-7h597\" (UID: \"8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3\") " pod="openshift-multus/network-metrics-daemon-7h597" Feb 28 09:05:02 crc kubenswrapper[4687]: E0228 09:05:02.434480 4687 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 09:05:02 crc kubenswrapper[4687]: E0228 09:05:02.434598 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3-metrics-certs podName:8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3 nodeName:}" failed. No retries permitted until 2026-02-28 09:05:10.434571609 +0000 UTC m=+102.125140956 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3-metrics-certs") pod "network-metrics-daemon-7h597" (UID: "8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.476371 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.476415 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.476424 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.476443 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.476454 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:02Z","lastTransitionTime":"2026-02-28T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.579388 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.579464 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.579481 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.579506 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.579522 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:02Z","lastTransitionTime":"2026-02-28T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.656411 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.656481 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.656535 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 09:05:02 crc kubenswrapper[4687]: E0228 09:05:02.656547 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 09:05:02 crc kubenswrapper[4687]: E0228 09:05:02.656619 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 09:05:02 crc kubenswrapper[4687]: E0228 09:05:02.656676 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.682289 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.682330 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.682338 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.682353 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.682368 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:02Z","lastTransitionTime":"2026-02-28T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.784587 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.784635 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.784645 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.784659 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.784669 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:02Z","lastTransitionTime":"2026-02-28T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.887995 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.888053 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.888062 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.888080 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.888089 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:02Z","lastTransitionTime":"2026-02-28T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.990964 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.991008 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.991033 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.991049 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:02 crc kubenswrapper[4687]: I0228 09:05:02.991060 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:02Z","lastTransitionTime":"2026-02-28T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.093297 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.093340 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.093350 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.093365 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.093375 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:03Z","lastTransitionTime":"2026-02-28T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.196118 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.196215 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.196233 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.196256 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.196273 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:03Z","lastTransitionTime":"2026-02-28T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.298232 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.298273 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.298283 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.298297 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.298307 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:03Z","lastTransitionTime":"2026-02-28T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.401057 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.401094 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.401106 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.401122 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.401131 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:03Z","lastTransitionTime":"2026-02-28T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.502870 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.502911 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.502923 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.502938 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.502949 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:03Z","lastTransitionTime":"2026-02-28T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.604621 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.604648 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.604657 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.604669 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.604679 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:03Z","lastTransitionTime":"2026-02-28T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.613825 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.613864 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.613875 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.613886 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.613896 4687 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T09:05:03Z","lastTransitionTime":"2026-02-28T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.660201 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9wzb"] Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.660652 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9wzb" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.662567 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.662783 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.663090 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.665235 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.667890 4687 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.675086 4687 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.748221 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a287f204-91ff-4f0e-90f8-2e7ff4547091-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-k9wzb\" (UID: \"a287f204-91ff-4f0e-90f8-2e7ff4547091\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9wzb" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.748269 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a287f204-91ff-4f0e-90f8-2e7ff4547091-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-k9wzb\" (UID: 
\"a287f204-91ff-4f0e-90f8-2e7ff4547091\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9wzb" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.748296 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a287f204-91ff-4f0e-90f8-2e7ff4547091-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-k9wzb\" (UID: \"a287f204-91ff-4f0e-90f8-2e7ff4547091\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9wzb" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.748315 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a287f204-91ff-4f0e-90f8-2e7ff4547091-service-ca\") pod \"cluster-version-operator-5c965bbfc6-k9wzb\" (UID: \"a287f204-91ff-4f0e-90f8-2e7ff4547091\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9wzb" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.748335 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a287f204-91ff-4f0e-90f8-2e7ff4547091-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-k9wzb\" (UID: \"a287f204-91ff-4f0e-90f8-2e7ff4547091\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9wzb" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.849475 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a287f204-91ff-4f0e-90f8-2e7ff4547091-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-k9wzb\" (UID: \"a287f204-91ff-4f0e-90f8-2e7ff4547091\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9wzb" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.849518 4687 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a287f204-91ff-4f0e-90f8-2e7ff4547091-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-k9wzb\" (UID: \"a287f204-91ff-4f0e-90f8-2e7ff4547091\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9wzb" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.849540 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a287f204-91ff-4f0e-90f8-2e7ff4547091-service-ca\") pod \"cluster-version-operator-5c965bbfc6-k9wzb\" (UID: \"a287f204-91ff-4f0e-90f8-2e7ff4547091\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9wzb" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.849559 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a287f204-91ff-4f0e-90f8-2e7ff4547091-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-k9wzb\" (UID: \"a287f204-91ff-4f0e-90f8-2e7ff4547091\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9wzb" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.849612 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a287f204-91ff-4f0e-90f8-2e7ff4547091-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-k9wzb\" (UID: \"a287f204-91ff-4f0e-90f8-2e7ff4547091\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9wzb" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.849643 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a287f204-91ff-4f0e-90f8-2e7ff4547091-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-k9wzb\" (UID: \"a287f204-91ff-4f0e-90f8-2e7ff4547091\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9wzb" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.849674 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a287f204-91ff-4f0e-90f8-2e7ff4547091-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-k9wzb\" (UID: \"a287f204-91ff-4f0e-90f8-2e7ff4547091\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9wzb" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.850758 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a287f204-91ff-4f0e-90f8-2e7ff4547091-service-ca\") pod \"cluster-version-operator-5c965bbfc6-k9wzb\" (UID: \"a287f204-91ff-4f0e-90f8-2e7ff4547091\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9wzb" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.856042 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a287f204-91ff-4f0e-90f8-2e7ff4547091-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-k9wzb\" (UID: \"a287f204-91ff-4f0e-90f8-2e7ff4547091\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9wzb" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.871280 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a287f204-91ff-4f0e-90f8-2e7ff4547091-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-k9wzb\" (UID: \"a287f204-91ff-4f0e-90f8-2e7ff4547091\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9wzb" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.971934 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9wzb" Feb 28 09:05:03 crc kubenswrapper[4687]: I0228 09:05:03.997560 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9wzb" event={"ID":"a287f204-91ff-4f0e-90f8-2e7ff4547091","Type":"ContainerStarted","Data":"dd039048e339b5fcf88a49702aaf28c1079fcefa52393f7703453285f864cf4c"} Feb 28 09:05:04 crc kubenswrapper[4687]: I0228 09:05:04.656828 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7h597" Feb 28 09:05:04 crc kubenswrapper[4687]: I0228 09:05:04.656843 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 09:05:04 crc kubenswrapper[4687]: I0228 09:05:04.657003 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 09:05:04 crc kubenswrapper[4687]: E0228 09:05:04.657102 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7h597" podUID="8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3" Feb 28 09:05:04 crc kubenswrapper[4687]: I0228 09:05:04.657122 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 09:05:04 crc kubenswrapper[4687]: E0228 09:05:04.657222 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 09:05:04 crc kubenswrapper[4687]: E0228 09:05:04.657251 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 09:05:04 crc kubenswrapper[4687]: E0228 09:05:04.657355 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.003051 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9wzb" event={"ID":"a287f204-91ff-4f0e-90f8-2e7ff4547091","Type":"ContainerStarted","Data":"950a3f5d2ced524dc98a35e7226db9762d729727704aedab229671942cd72945"} Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.015566 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k9wzb" podStartSLOduration=68.01554317 podStartE2EDuration="1m8.01554317s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:05.014699391 +0000 UTC m=+96.705268738" watchObservedRunningTime="2026-02-28 09:05:05.01554317 +0000 UTC m=+96.706112507" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.250403 4687 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.250632 4687 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.281506 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-lmsz6"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.282065 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lmsz6" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.283562 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6df9f"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.283965 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-6df9f" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.291760 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bqdqx"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.291898 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.292222 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.292461 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.292729 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.292761 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.292797 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.293114 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 28 09:05:05 crc 
kubenswrapper[4687]: I0228 09:05:05.293187 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.293449 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.294067 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.294257 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.302467 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-w4s2q"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.302610 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.303253 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gqppb"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.303490 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9thbt"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.303817 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-4m8kh"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.303849 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-w4s2q" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.303878 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gqppb" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.304072 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tx86n"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.304307 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.304404 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.304459 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4m8kh" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.304700 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.304723 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.304879 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zhdhr"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.305252 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.305334 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.305378 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.306169 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kz4nd"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.306641 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.306648 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-shn8j"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.306908 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-shn8j" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.306959 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kz4nd" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.307108 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-8vhfl"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.307705 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w7jmh"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.308116 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w7jmh" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.308128 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-rsgcf"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.308430 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-8vhfl" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.308949 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.309298 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.309508 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-rsgcf" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.309953 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-9thbt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.310286 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vx4hz"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.310525 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.310669 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-n9h5k"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.310961 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.311135 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vx4hz" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.311381 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.311456 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.312533 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-zrtwj"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.313269 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7tgnq"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.313507 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5grmr"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.313896 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-5grmr" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.313929 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-zrtwj" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.313968 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9b6d5"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.314113 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7tgnq" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.314415 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9b6d5" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.314492 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.314610 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.316415 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mhzwl"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.316948 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p4ft4"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.322395 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-fb24z"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.322549 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p4ft4" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.323203 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mhzwl" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.329108 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-494jw"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.330038 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dvw6x"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.330903 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s2d57"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.332503 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s2d57" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.332668 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xptn"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.333133 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fb24z" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.333800 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-494jw" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.336946 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-dvw6x" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.352780 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-stsx5"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.353622 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-stsx5" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.353881 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xptn" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.353969 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-qns64"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.354659 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.354731 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-qns64" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.354786 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.357186 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.357278 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.357302 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.357381 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.357454 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.357475 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.357536 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.357590 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.357637 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.357473 4687 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.357663 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.357696 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.357760 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.357863 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.357886 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.357894 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.357927 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.357996 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358032 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358077 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 
09:05:05.358117 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358160 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358203 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358255 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358276 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358122 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358382 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358396 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358412 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358447 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358499 4687 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358513 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358541 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358586 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358639 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358651 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358707 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358726 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358734 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358784 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358826 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 28 09:05:05 crc kubenswrapper[4687]: 
I0228 09:05:05.358835 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358894 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358909 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358928 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358585 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.359003 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.359010 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358965 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.359146 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.357777 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358830 4687 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"oauth-serving-cert" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.357997 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.358541 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.359423 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.359660 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.359691 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.359828 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.360059 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.360164 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.360311 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hhlxz"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.363546 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hhlxz" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.365325 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.365478 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.365584 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.365799 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/32938de1-4583-4038-8ced-e6bc1327911a-auth-proxy-config\") pod \"machine-approver-56656f9798-lmsz6\" (UID: \"32938de1-4583-4038-8ced-e6bc1327911a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lmsz6" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.365830 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/32938de1-4583-4038-8ced-e6bc1327911a-machine-approver-tls\") pod \"machine-approver-56656f9798-lmsz6\" (UID: \"32938de1-4583-4038-8ced-e6bc1327911a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lmsz6" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.365841 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.365853 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/4553b1b1-a202-4e03-8d3a-bdb2eb6042d1-trusted-ca\") pod \"console-operator-58897d9998-6df9f\" (UID: \"4553b1b1-a202-4e03-8d3a-bdb2eb6042d1\") " pod="openshift-console-operator/console-operator-58897d9998-6df9f" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.365873 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j9v5\" (UniqueName: \"kubernetes.io/projected/32938de1-4583-4038-8ced-e6bc1327911a-kube-api-access-4j9v5\") pod \"machine-approver-56656f9798-lmsz6\" (UID: \"32938de1-4583-4038-8ced-e6bc1327911a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lmsz6" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.365916 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32938de1-4583-4038-8ced-e6bc1327911a-config\") pod \"machine-approver-56656f9798-lmsz6\" (UID: \"32938de1-4583-4038-8ced-e6bc1327911a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lmsz6" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.365939 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.365938 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84z6r\" (UniqueName: \"kubernetes.io/projected/4553b1b1-a202-4e03-8d3a-bdb2eb6042d1-kube-api-access-84z6r\") pod \"console-operator-58897d9998-6df9f\" (UID: \"4553b1b1-a202-4e03-8d3a-bdb2eb6042d1\") " pod="openshift-console-operator/console-operator-58897d9998-6df9f" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.366052 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 
09:05:05.366063 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4553b1b1-a202-4e03-8d3a-bdb2eb6042d1-config\") pod \"console-operator-58897d9998-6df9f\" (UID: \"4553b1b1-a202-4e03-8d3a-bdb2eb6042d1\") " pod="openshift-console-operator/console-operator-58897d9998-6df9f" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.366120 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4553b1b1-a202-4e03-8d3a-bdb2eb6042d1-serving-cert\") pod \"console-operator-58897d9998-6df9f\" (UID: \"4553b1b1-a202-4e03-8d3a-bdb2eb6042d1\") " pod="openshift-console-operator/console-operator-58897d9998-6df9f" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.366430 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.367589 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4j8df"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.368284 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9g6h"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.368465 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4j8df" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.368729 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9g6h" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.368996 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6df9f"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.370207 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.370212 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.370359 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5xt25"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.370734 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5xt25" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.370962 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-q26sq"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.371304 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-q26sq" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.372216 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-w4s2q"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.372278 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537820-r5c29"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.372845 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537820-r5c29" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.373694 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.374044 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.377406 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-bk6v7"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.378086 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bk6v7" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.384036 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.386122 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.388992 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-jb8xd"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.389901 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-2kghk"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.390788 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.391817 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.395209 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.402388 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kz4nd"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.402414 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9thbt"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.402428 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-4m8kh"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.402504 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-2kghk" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.403264 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.403884 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.404602 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.404685 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.405160 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.406637 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.408189 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9b6d5"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.409168 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bqdqx"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.409414 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dvw6x"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.411922 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 
09:05:05.415226 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zhdhr"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.415984 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5grmr"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.417090 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-n9h5k"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.418186 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p4ft4"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.420424 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4j8df"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.421229 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.423926 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8vhfl"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.424319 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.425976 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-shn8j"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.427255 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tx86n"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.429039 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xptn"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.429061 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.430249 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hhlxz"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.431438 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mhzwl"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.431513 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-rsgcf"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.433134 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-494jw"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.433353 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-fb24z"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.434833 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vx4hz"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.434858 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gqppb"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.435732 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s2d57"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.436090 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-777779d784-bk6v7"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.436740 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7tgnq"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.437454 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-stsx5"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.438151 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9g6h"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.438805 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537820-r5c29"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.439852 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-qns64"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.440508 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5xt25"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.440910 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-26znf"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.444206 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-47grc"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.444928 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-q26sq"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.444991 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-26znf" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.445039 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-47grc" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.446390 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w7jmh"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.448081 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-26znf"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.449098 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-47grc"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.451675 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.464008 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.466593 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4553b1b1-a202-4e03-8d3a-bdb2eb6042d1-config\") pod \"console-operator-58897d9998-6df9f\" (UID: \"4553b1b1-a202-4e03-8d3a-bdb2eb6042d1\") " pod="openshift-console-operator/console-operator-58897d9998-6df9f" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.467318 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4553b1b1-a202-4e03-8d3a-bdb2eb6042d1-config\") pod \"console-operator-58897d9998-6df9f\" (UID: \"4553b1b1-a202-4e03-8d3a-bdb2eb6042d1\") " pod="openshift-console-operator/console-operator-58897d9998-6df9f" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 
09:05:05.467328 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04e92ab6-5602-4d97-9e70-f95ff9769a79-client-ca\") pod \"route-controller-manager-6576b87f9c-4f5xw\" (UID: \"04e92ab6-5602-4d97-9e70-f95ff9769a79\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.467429 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4553b1b1-a202-4e03-8d3a-bdb2eb6042d1-serving-cert\") pod \"console-operator-58897d9998-6df9f\" (UID: \"4553b1b1-a202-4e03-8d3a-bdb2eb6042d1\") " pod="openshift-console-operator/console-operator-58897d9998-6df9f" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.467467 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defaa3ff-8d11-49b1-a9d4-7f54a0650d0a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-stsx5\" (UID: \"defaa3ff-8d11-49b1-a9d4-7f54a0650d0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-stsx5" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.467491 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-748sw\" (UniqueName: \"kubernetes.io/projected/eb51fd3a-6467-4fcd-b3c0-6e8efa30aa2b-kube-api-access-748sw\") pod \"openshift-controller-manager-operator-756b6f6bc6-shn8j\" (UID: \"eb51fd3a-6467-4fcd-b3c0-6e8efa30aa2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-shn8j" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.467515 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/5aacd998-f8bd-49e1-8d54-a4775c7e1f83-proxy-tls\") pod \"machine-config-operator-74547568cd-4j8df\" (UID: \"5aacd998-f8bd-49e1-8d54-a4775c7e1f83\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4j8df" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.467533 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/941eced7-a875-42d3-91a4-d36f770b30a6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-s2d57\" (UID: \"941eced7-a875-42d3-91a4-d36f770b30a6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s2d57" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.467577 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-599bw\" (UniqueName: \"kubernetes.io/projected/4bfbdc6f-2078-4dee-b253-d7658f4e839d-kube-api-access-599bw\") pod \"packageserver-d55dfcdfc-w9g6h\" (UID: \"4bfbdc6f-2078-4dee-b253-d7658f4e839d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9g6h" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.467644 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/32938de1-4583-4038-8ced-e6bc1327911a-machine-approver-tls\") pod \"machine-approver-56656f9798-lmsz6\" (UID: \"32938de1-4583-4038-8ced-e6bc1327911a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lmsz6" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.467694 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/80136d7f-e0f8-4ff5-a22a-b5933f9e2cf0-profile-collector-cert\") pod \"catalog-operator-68c6474976-4xptn\" (UID: \"80136d7f-e0f8-4ff5-a22a-b5933f9e2cf0\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xptn" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.467850 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4553b1b1-a202-4e03-8d3a-bdb2eb6042d1-trusted-ca\") pod \"console-operator-58897d9998-6df9f\" (UID: \"4553b1b1-a202-4e03-8d3a-bdb2eb6042d1\") " pod="openshift-console-operator/console-operator-58897d9998-6df9f" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.467943 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a47bc793-deb1-42d1-9759-42c79b7ef053-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gqppb\" (UID: \"a47bc793-deb1-42d1-9759-42c79b7ef053\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gqppb" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.467974 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/083e169f-d8ba-4454-a2e4-84587ae7551c-config\") pod \"authentication-operator-69f744f599-rsgcf\" (UID: \"083e169f-d8ba-4454-a2e4-84587ae7551c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rsgcf" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.467997 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/083e169f-d8ba-4454-a2e4-84587ae7551c-service-ca-bundle\") pod \"authentication-operator-69f744f599-rsgcf\" (UID: \"083e169f-d8ba-4454-a2e4-84587ae7551c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rsgcf" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468046 4687 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0-etcd-ca\") pod \"etcd-operator-b45778765-dvw6x\" (UID: \"317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dvw6x" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468093 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr4lw\" (UniqueName: \"kubernetes.io/projected/eaa6a825-72b4-4544-9e19-5af6b2c7648e-kube-api-access-vr4lw\") pod \"controller-manager-879f6c89f-tx86n\" (UID: \"eaa6a825-72b4-4544-9e19-5af6b2c7648e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468109 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5aacd998-f8bd-49e1-8d54-a4775c7e1f83-images\") pod \"machine-config-operator-74547568cd-4j8df\" (UID: \"5aacd998-f8bd-49e1-8d54-a4775c7e1f83\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4j8df" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468129 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a47bc793-deb1-42d1-9759-42c79b7ef053-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gqppb\" (UID: \"a47bc793-deb1-42d1-9759-42c79b7ef053\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gqppb" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468147 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtrj9\" (UniqueName: \"kubernetes.io/projected/317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0-kube-api-access-jtrj9\") pod \"etcd-operator-b45778765-dvw6x\" (UID: 
\"317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dvw6x" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468162 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4aa62668-b8d2-4c84-a890-c91f79aae6e6-certs\") pod \"machine-config-server-2kghk\" (UID: \"4aa62668-b8d2-4c84-a890-c91f79aae6e6\") " pod="openshift-machine-config-operator/machine-config-server-2kghk" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468180 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c6840e9-1b32-4e33-aa8c-31285246df48-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-p4ft4\" (UID: \"9c6840e9-1b32-4e33-aa8c-31285246df48\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p4ft4" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468203 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32938de1-4583-4038-8ced-e6bc1327911a-config\") pod \"machine-approver-56656f9798-lmsz6\" (UID: \"32938de1-4583-4038-8ced-e6bc1327911a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lmsz6" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468220 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eaa6a825-72b4-4544-9e19-5af6b2c7648e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tx86n\" (UID: \"eaa6a825-72b4-4544-9e19-5af6b2c7648e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468235 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0-serving-cert\") pod \"etcd-operator-b45778765-dvw6x\" (UID: \"317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dvw6x" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468257 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/defaa3ff-8d11-49b1-a9d4-7f54a0650d0a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-stsx5\" (UID: \"defaa3ff-8d11-49b1-a9d4-7f54a0650d0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-stsx5" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468273 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4bfbdc6f-2078-4dee-b253-d7658f4e839d-webhook-cert\") pod \"packageserver-d55dfcdfc-w9g6h\" (UID: \"4bfbdc6f-2078-4dee-b253-d7658f4e839d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9g6h" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468294 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eaa6a825-72b4-4544-9e19-5af6b2c7648e-client-ca\") pod \"controller-manager-879f6c89f-tx86n\" (UID: \"eaa6a825-72b4-4544-9e19-5af6b2c7648e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468311 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a47bc793-deb1-42d1-9759-42c79b7ef053-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gqppb\" (UID: \"a47bc793-deb1-42d1-9759-42c79b7ef053\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gqppb" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468326 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0-etcd-service-ca\") pod \"etcd-operator-b45778765-dvw6x\" (UID: \"317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dvw6x" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468345 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgjd2\" (UniqueName: \"kubernetes.io/projected/c3d794dc-474f-4572-8227-60bc4a41c69e-kube-api-access-mgjd2\") pod \"kube-storage-version-migrator-operator-b67b599dd-7tgnq\" (UID: \"c3d794dc-474f-4572-8227-60bc4a41c69e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7tgnq" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468372 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gxcf\" (UniqueName: \"kubernetes.io/projected/80136d7f-e0f8-4ff5-a22a-b5933f9e2cf0-kube-api-access-7gxcf\") pod \"catalog-operator-68c6474976-4xptn\" (UID: \"80136d7f-e0f8-4ff5-a22a-b5933f9e2cf0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xptn" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468388 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/083e169f-d8ba-4454-a2e4-84587ae7551c-serving-cert\") pod \"authentication-operator-69f744f599-rsgcf\" (UID: \"083e169f-d8ba-4454-a2e4-84587ae7551c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rsgcf" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468408 4687 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87d5p\" (UniqueName: \"kubernetes.io/projected/5aacd998-f8bd-49e1-8d54-a4775c7e1f83-kube-api-access-87d5p\") pod \"machine-config-operator-74547568cd-4j8df\" (UID: \"5aacd998-f8bd-49e1-8d54-a4775c7e1f83\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4j8df" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468427 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/80136d7f-e0f8-4ff5-a22a-b5933f9e2cf0-srv-cert\") pod \"catalog-operator-68c6474976-4xptn\" (UID: \"80136d7f-e0f8-4ff5-a22a-b5933f9e2cf0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xptn" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468443 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3d794dc-474f-4572-8227-60bc4a41c69e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7tgnq\" (UID: \"c3d794dc-474f-4572-8227-60bc4a41c69e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7tgnq" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468460 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/01c22693-ad9c-426f-8ae6-ac335c7cbca1-metrics-tls\") pod \"dns-operator-744455d44c-5grmr\" (UID: \"01c22693-ad9c-426f-8ae6-ac335c7cbca1\") " pod="openshift-dns-operator/dns-operator-744455d44c-5grmr" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468479 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhnwc\" (UniqueName: 
\"kubernetes.io/projected/083e169f-d8ba-4454-a2e4-84587ae7551c-kube-api-access-xhnwc\") pod \"authentication-operator-69f744f599-rsgcf\" (UID: \"083e169f-d8ba-4454-a2e4-84587ae7551c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rsgcf" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468504 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04e92ab6-5602-4d97-9e70-f95ff9769a79-config\") pod \"route-controller-manager-6576b87f9c-4f5xw\" (UID: \"04e92ab6-5602-4d97-9e70-f95ff9769a79\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468518 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0-etcd-client\") pod \"etcd-operator-b45778765-dvw6x\" (UID: \"317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dvw6x" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468536 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/defaa3ff-8d11-49b1-a9d4-7f54a0650d0a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-stsx5\" (UID: \"defaa3ff-8d11-49b1-a9d4-7f54a0650d0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-stsx5" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468555 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wxsl\" (UniqueName: \"kubernetes.io/projected/04e92ab6-5602-4d97-9e70-f95ff9769a79-kube-api-access-6wxsl\") pod \"route-controller-manager-6576b87f9c-4f5xw\" (UID: \"04e92ab6-5602-4d97-9e70-f95ff9769a79\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468574 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c6840e9-1b32-4e33-aa8c-31285246df48-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-p4ft4\" (UID: \"9c6840e9-1b32-4e33-aa8c-31285246df48\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p4ft4" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468593 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/32938de1-4583-4038-8ced-e6bc1327911a-auth-proxy-config\") pod \"machine-approver-56656f9798-lmsz6\" (UID: \"32938de1-4583-4038-8ced-e6bc1327911a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lmsz6" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468609 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4aa62668-b8d2-4c84-a890-c91f79aae6e6-node-bootstrap-token\") pod \"machine-config-server-2kghk\" (UID: \"4aa62668-b8d2-4c84-a890-c91f79aae6e6\") " pod="openshift-machine-config-operator/machine-config-server-2kghk" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468627 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/941eced7-a875-42d3-91a4-d36f770b30a6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-s2d57\" (UID: \"941eced7-a875-42d3-91a4-d36f770b30a6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s2d57" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468645 4687 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eaa6a825-72b4-4544-9e19-5af6b2c7648e-serving-cert\") pod \"controller-manager-879f6c89f-tx86n\" (UID: \"eaa6a825-72b4-4544-9e19-5af6b2c7648e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468658 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4553b1b1-a202-4e03-8d3a-bdb2eb6042d1-trusted-ca\") pod \"console-operator-58897d9998-6df9f\" (UID: \"4553b1b1-a202-4e03-8d3a-bdb2eb6042d1\") " pod="openshift-console-operator/console-operator-58897d9998-6df9f" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468661 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb51fd3a-6467-4fcd-b3c0-6e8efa30aa2b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-shn8j\" (UID: \"eb51fd3a-6467-4fcd-b3c0-6e8efa30aa2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-shn8j" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468712 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04e92ab6-5602-4d97-9e70-f95ff9769a79-serving-cert\") pod \"route-controller-manager-6576b87f9c-4f5xw\" (UID: \"04e92ab6-5602-4d97-9e70-f95ff9769a79\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468734 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3d794dc-474f-4572-8227-60bc4a41c69e-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-7tgnq\" (UID: \"c3d794dc-474f-4572-8227-60bc4a41c69e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7tgnq" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468750 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/083e169f-d8ba-4454-a2e4-84587ae7551c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-rsgcf\" (UID: \"083e169f-d8ba-4454-a2e4-84587ae7551c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rsgcf" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468781 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j9v5\" (UniqueName: \"kubernetes.io/projected/32938de1-4583-4038-8ced-e6bc1327911a-kube-api-access-4j9v5\") pod \"machine-approver-56656f9798-lmsz6\" (UID: \"32938de1-4583-4038-8ced-e6bc1327911a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lmsz6" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468799 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3b45242a-b238-4814-b6fa-f22a62c5907f-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-jb8xd\" (UID: \"3b45242a-b238-4814-b6fa-f22a62c5907f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468819 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0-config\") pod \"etcd-operator-b45778765-dvw6x\" (UID: \"317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dvw6x" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 
09:05:05.468834 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4bfbdc6f-2078-4dee-b253-d7658f4e839d-apiservice-cert\") pod \"packageserver-d55dfcdfc-w9g6h\" (UID: \"4bfbdc6f-2078-4dee-b253-d7658f4e839d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9g6h" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468850 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ccr7\" (UniqueName: \"kubernetes.io/projected/01c22693-ad9c-426f-8ae6-ac335c7cbca1-kube-api-access-6ccr7\") pod \"dns-operator-744455d44c-5grmr\" (UID: \"01c22693-ad9c-426f-8ae6-ac335c7cbca1\") " pod="openshift-dns-operator/dns-operator-744455d44c-5grmr" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468881 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3b45242a-b238-4814-b6fa-f22a62c5907f-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-jb8xd\" (UID: \"3b45242a-b238-4814-b6fa-f22a62c5907f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468899 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sp8t\" (UniqueName: \"kubernetes.io/projected/3b45242a-b238-4814-b6fa-f22a62c5907f-kube-api-access-6sp8t\") pod \"cni-sysctl-allowlist-ds-jb8xd\" (UID: \"3b45242a-b238-4814-b6fa-f22a62c5907f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468915 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb51fd3a-6467-4fcd-b3c0-6e8efa30aa2b-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-shn8j\" (UID: \"eb51fd3a-6467-4fcd-b3c0-6e8efa30aa2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-shn8j" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468930 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4bfbdc6f-2078-4dee-b253-d7658f4e839d-tmpfs\") pod \"packageserver-d55dfcdfc-w9g6h\" (UID: \"4bfbdc6f-2078-4dee-b253-d7658f4e839d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9g6h" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468950 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaa6a825-72b4-4544-9e19-5af6b2c7648e-config\") pod \"controller-manager-879f6c89f-tx86n\" (UID: \"eaa6a825-72b4-4544-9e19-5af6b2c7648e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468966 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c6840e9-1b32-4e33-aa8c-31285246df48-config\") pod \"kube-apiserver-operator-766d6c64bb-p4ft4\" (UID: \"9c6840e9-1b32-4e33-aa8c-31285246df48\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p4ft4" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468984 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84z6r\" (UniqueName: \"kubernetes.io/projected/4553b1b1-a202-4e03-8d3a-bdb2eb6042d1-kube-api-access-84z6r\") pod \"console-operator-58897d9998-6df9f\" (UID: \"4553b1b1-a202-4e03-8d3a-bdb2eb6042d1\") " pod="openshift-console-operator/console-operator-58897d9998-6df9f" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.468999 4687 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5aacd998-f8bd-49e1-8d54-a4775c7e1f83-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4j8df\" (UID: \"5aacd998-f8bd-49e1-8d54-a4775c7e1f83\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4j8df" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.469015 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3b45242a-b238-4814-b6fa-f22a62c5907f-ready\") pod \"cni-sysctl-allowlist-ds-jb8xd\" (UID: \"3b45242a-b238-4814-b6fa-f22a62c5907f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.469048 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj6pb\" (UniqueName: \"kubernetes.io/projected/a47bc793-deb1-42d1-9759-42c79b7ef053-kube-api-access-mj6pb\") pod \"cluster-image-registry-operator-dc59b4c8b-gqppb\" (UID: \"a47bc793-deb1-42d1-9759-42c79b7ef053\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gqppb" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.469066 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blgwk\" (UniqueName: \"kubernetes.io/projected/4aa62668-b8d2-4c84-a890-c91f79aae6e6-kube-api-access-blgwk\") pod \"machine-config-server-2kghk\" (UID: \"4aa62668-b8d2-4c84-a890-c91f79aae6e6\") " pod="openshift-machine-config-operator/machine-config-server-2kghk" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.469112 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/941eced7-a875-42d3-91a4-d36f770b30a6-config\") pod 
\"kube-controller-manager-operator-78b949d7b-s2d57\" (UID: \"941eced7-a875-42d3-91a4-d36f770b30a6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s2d57" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.469322 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32938de1-4583-4038-8ced-e6bc1327911a-config\") pod \"machine-approver-56656f9798-lmsz6\" (UID: \"32938de1-4583-4038-8ced-e6bc1327911a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lmsz6" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.469836 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/32938de1-4583-4038-8ced-e6bc1327911a-auth-proxy-config\") pod \"machine-approver-56656f9798-lmsz6\" (UID: \"32938de1-4583-4038-8ced-e6bc1327911a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lmsz6" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.470935 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4553b1b1-a202-4e03-8d3a-bdb2eb6042d1-serving-cert\") pod \"console-operator-58897d9998-6df9f\" (UID: \"4553b1b1-a202-4e03-8d3a-bdb2eb6042d1\") " pod="openshift-console-operator/console-operator-58897d9998-6df9f" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.471727 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/32938de1-4583-4038-8ced-e6bc1327911a-machine-approver-tls\") pod \"machine-approver-56656f9798-lmsz6\" (UID: \"32938de1-4583-4038-8ced-e6bc1327911a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lmsz6" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.485123 4687 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.501373 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-kvzpk"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.502398 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kvzpk" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.504133 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.507101 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kvzpk"] Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.524602 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.544858 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.564328 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570319 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gxcf\" (UniqueName: \"kubernetes.io/projected/80136d7f-e0f8-4ff5-a22a-b5933f9e2cf0-kube-api-access-7gxcf\") pod \"catalog-operator-68c6474976-4xptn\" (UID: \"80136d7f-e0f8-4ff5-a22a-b5933f9e2cf0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xptn" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570357 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/083e169f-d8ba-4454-a2e4-84587ae7551c-serving-cert\") pod 
\"authentication-operator-69f744f599-rsgcf\" (UID: \"083e169f-d8ba-4454-a2e4-84587ae7551c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rsgcf" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570384 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3d794dc-474f-4572-8227-60bc4a41c69e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7tgnq\" (UID: \"c3d794dc-474f-4572-8227-60bc4a41c69e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7tgnq" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570409 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/01c22693-ad9c-426f-8ae6-ac335c7cbca1-metrics-tls\") pod \"dns-operator-744455d44c-5grmr\" (UID: \"01c22693-ad9c-426f-8ae6-ac335c7cbca1\") " pod="openshift-dns-operator/dns-operator-744455d44c-5grmr" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570431 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87d5p\" (UniqueName: \"kubernetes.io/projected/5aacd998-f8bd-49e1-8d54-a4775c7e1f83-kube-api-access-87d5p\") pod \"machine-config-operator-74547568cd-4j8df\" (UID: \"5aacd998-f8bd-49e1-8d54-a4775c7e1f83\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4j8df" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570450 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/80136d7f-e0f8-4ff5-a22a-b5933f9e2cf0-srv-cert\") pod \"catalog-operator-68c6474976-4xptn\" (UID: \"80136d7f-e0f8-4ff5-a22a-b5933f9e2cf0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xptn" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570468 4687 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xhnwc\" (UniqueName: \"kubernetes.io/projected/083e169f-d8ba-4454-a2e4-84587ae7551c-kube-api-access-xhnwc\") pod \"authentication-operator-69f744f599-rsgcf\" (UID: \"083e169f-d8ba-4454-a2e4-84587ae7551c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rsgcf" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570491 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04e92ab6-5602-4d97-9e70-f95ff9769a79-config\") pod \"route-controller-manager-6576b87f9c-4f5xw\" (UID: \"04e92ab6-5602-4d97-9e70-f95ff9769a79\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570509 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0-etcd-client\") pod \"etcd-operator-b45778765-dvw6x\" (UID: \"317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dvw6x" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570527 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/defaa3ff-8d11-49b1-a9d4-7f54a0650d0a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-stsx5\" (UID: \"defaa3ff-8d11-49b1-a9d4-7f54a0650d0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-stsx5" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570544 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wxsl\" (UniqueName: \"kubernetes.io/projected/04e92ab6-5602-4d97-9e70-f95ff9769a79-kube-api-access-6wxsl\") pod \"route-controller-manager-6576b87f9c-4f5xw\" (UID: 
\"04e92ab6-5602-4d97-9e70-f95ff9769a79\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570562 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c6840e9-1b32-4e33-aa8c-31285246df48-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-p4ft4\" (UID: \"9c6840e9-1b32-4e33-aa8c-31285246df48\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p4ft4" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570584 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4aa62668-b8d2-4c84-a890-c91f79aae6e6-node-bootstrap-token\") pod \"machine-config-server-2kghk\" (UID: \"4aa62668-b8d2-4c84-a890-c91f79aae6e6\") " pod="openshift-machine-config-operator/machine-config-server-2kghk" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570603 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/941eced7-a875-42d3-91a4-d36f770b30a6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-s2d57\" (UID: \"941eced7-a875-42d3-91a4-d36f770b30a6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s2d57" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570621 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/083e169f-d8ba-4454-a2e4-84587ae7551c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-rsgcf\" (UID: \"083e169f-d8ba-4454-a2e4-84587ae7551c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rsgcf" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570641 4687 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eaa6a825-72b4-4544-9e19-5af6b2c7648e-serving-cert\") pod \"controller-manager-879f6c89f-tx86n\" (UID: \"eaa6a825-72b4-4544-9e19-5af6b2c7648e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570661 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb51fd3a-6467-4fcd-b3c0-6e8efa30aa2b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-shn8j\" (UID: \"eb51fd3a-6467-4fcd-b3c0-6e8efa30aa2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-shn8j" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570679 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04e92ab6-5602-4d97-9e70-f95ff9769a79-serving-cert\") pod \"route-controller-manager-6576b87f9c-4f5xw\" (UID: \"04e92ab6-5602-4d97-9e70-f95ff9769a79\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570697 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3d794dc-474f-4572-8227-60bc4a41c69e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7tgnq\" (UID: \"c3d794dc-474f-4572-8227-60bc4a41c69e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7tgnq" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570724 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3b45242a-b238-4814-b6fa-f22a62c5907f-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-jb8xd\" (UID: 
\"3b45242a-b238-4814-b6fa-f22a62c5907f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570743 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0-config\") pod \"etcd-operator-b45778765-dvw6x\" (UID: \"317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dvw6x" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570785 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4bfbdc6f-2078-4dee-b253-d7658f4e839d-apiservice-cert\") pod \"packageserver-d55dfcdfc-w9g6h\" (UID: \"4bfbdc6f-2078-4dee-b253-d7658f4e839d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9g6h" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570804 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ccr7\" (UniqueName: \"kubernetes.io/projected/01c22693-ad9c-426f-8ae6-ac335c7cbca1-kube-api-access-6ccr7\") pod \"dns-operator-744455d44c-5grmr\" (UID: \"01c22693-ad9c-426f-8ae6-ac335c7cbca1\") " pod="openshift-dns-operator/dns-operator-744455d44c-5grmr" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570835 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3b45242a-b238-4814-b6fa-f22a62c5907f-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-jb8xd\" (UID: \"3b45242a-b238-4814-b6fa-f22a62c5907f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570856 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sp8t\" (UniqueName: 
\"kubernetes.io/projected/3b45242a-b238-4814-b6fa-f22a62c5907f-kube-api-access-6sp8t\") pod \"cni-sysctl-allowlist-ds-jb8xd\" (UID: \"3b45242a-b238-4814-b6fa-f22a62c5907f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570874 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb51fd3a-6467-4fcd-b3c0-6e8efa30aa2b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-shn8j\" (UID: \"eb51fd3a-6467-4fcd-b3c0-6e8efa30aa2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-shn8j" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570890 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4bfbdc6f-2078-4dee-b253-d7658f4e839d-tmpfs\") pod \"packageserver-d55dfcdfc-w9g6h\" (UID: \"4bfbdc6f-2078-4dee-b253-d7658f4e839d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9g6h" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570908 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaa6a825-72b4-4544-9e19-5af6b2c7648e-config\") pod \"controller-manager-879f6c89f-tx86n\" (UID: \"eaa6a825-72b4-4544-9e19-5af6b2c7648e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570924 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c6840e9-1b32-4e33-aa8c-31285246df48-config\") pod \"kube-apiserver-operator-766d6c64bb-p4ft4\" (UID: \"9c6840e9-1b32-4e33-aa8c-31285246df48\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p4ft4" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570942 4687 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/941eced7-a875-42d3-91a4-d36f770b30a6-config\") pod \"kube-controller-manager-operator-78b949d7b-s2d57\" (UID: \"941eced7-a875-42d3-91a4-d36f770b30a6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s2d57" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570967 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5aacd998-f8bd-49e1-8d54-a4775c7e1f83-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4j8df\" (UID: \"5aacd998-f8bd-49e1-8d54-a4775c7e1f83\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4j8df" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.570983 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3b45242a-b238-4814-b6fa-f22a62c5907f-ready\") pod \"cni-sysctl-allowlist-ds-jb8xd\" (UID: \"3b45242a-b238-4814-b6fa-f22a62c5907f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571001 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj6pb\" (UniqueName: \"kubernetes.io/projected/a47bc793-deb1-42d1-9759-42c79b7ef053-kube-api-access-mj6pb\") pod \"cluster-image-registry-operator-dc59b4c8b-gqppb\" (UID: \"a47bc793-deb1-42d1-9759-42c79b7ef053\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gqppb" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571042 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blgwk\" (UniqueName: \"kubernetes.io/projected/4aa62668-b8d2-4c84-a890-c91f79aae6e6-kube-api-access-blgwk\") pod \"machine-config-server-2kghk\" (UID: 
\"4aa62668-b8d2-4c84-a890-c91f79aae6e6\") " pod="openshift-machine-config-operator/machine-config-server-2kghk" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571068 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04e92ab6-5602-4d97-9e70-f95ff9769a79-client-ca\") pod \"route-controller-manager-6576b87f9c-4f5xw\" (UID: \"04e92ab6-5602-4d97-9e70-f95ff9769a79\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571087 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defaa3ff-8d11-49b1-a9d4-7f54a0650d0a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-stsx5\" (UID: \"defaa3ff-8d11-49b1-a9d4-7f54a0650d0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-stsx5" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571105 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-748sw\" (UniqueName: \"kubernetes.io/projected/eb51fd3a-6467-4fcd-b3c0-6e8efa30aa2b-kube-api-access-748sw\") pod \"openshift-controller-manager-operator-756b6f6bc6-shn8j\" (UID: \"eb51fd3a-6467-4fcd-b3c0-6e8efa30aa2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-shn8j" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571130 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5aacd998-f8bd-49e1-8d54-a4775c7e1f83-proxy-tls\") pod \"machine-config-operator-74547568cd-4j8df\" (UID: \"5aacd998-f8bd-49e1-8d54-a4775c7e1f83\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4j8df" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571158 4687 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/941eced7-a875-42d3-91a4-d36f770b30a6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-s2d57\" (UID: \"941eced7-a875-42d3-91a4-d36f770b30a6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s2d57" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571178 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-599bw\" (UniqueName: \"kubernetes.io/projected/4bfbdc6f-2078-4dee-b253-d7658f4e839d-kube-api-access-599bw\") pod \"packageserver-d55dfcdfc-w9g6h\" (UID: \"4bfbdc6f-2078-4dee-b253-d7658f4e839d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9g6h" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571198 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/80136d7f-e0f8-4ff5-a22a-b5933f9e2cf0-profile-collector-cert\") pod \"catalog-operator-68c6474976-4xptn\" (UID: \"80136d7f-e0f8-4ff5-a22a-b5933f9e2cf0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xptn" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571216 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/083e169f-d8ba-4454-a2e4-84587ae7551c-config\") pod \"authentication-operator-69f744f599-rsgcf\" (UID: \"083e169f-d8ba-4454-a2e4-84587ae7551c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rsgcf" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571234 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/083e169f-d8ba-4454-a2e4-84587ae7551c-service-ca-bundle\") pod \"authentication-operator-69f744f599-rsgcf\" (UID: 
\"083e169f-d8ba-4454-a2e4-84587ae7551c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rsgcf" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571254 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a47bc793-deb1-42d1-9759-42c79b7ef053-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gqppb\" (UID: \"a47bc793-deb1-42d1-9759-42c79b7ef053\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gqppb" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571276 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0-etcd-ca\") pod \"etcd-operator-b45778765-dvw6x\" (UID: \"317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dvw6x" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571306 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtrj9\" (UniqueName: \"kubernetes.io/projected/317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0-kube-api-access-jtrj9\") pod \"etcd-operator-b45778765-dvw6x\" (UID: \"317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dvw6x" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571326 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4aa62668-b8d2-4c84-a890-c91f79aae6e6-certs\") pod \"machine-config-server-2kghk\" (UID: \"4aa62668-b8d2-4c84-a890-c91f79aae6e6\") " pod="openshift-machine-config-operator/machine-config-server-2kghk" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571344 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr4lw\" (UniqueName: 
\"kubernetes.io/projected/eaa6a825-72b4-4544-9e19-5af6b2c7648e-kube-api-access-vr4lw\") pod \"controller-manager-879f6c89f-tx86n\" (UID: \"eaa6a825-72b4-4544-9e19-5af6b2c7648e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571361 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5aacd998-f8bd-49e1-8d54-a4775c7e1f83-images\") pod \"machine-config-operator-74547568cd-4j8df\" (UID: \"5aacd998-f8bd-49e1-8d54-a4775c7e1f83\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4j8df" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571381 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a47bc793-deb1-42d1-9759-42c79b7ef053-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gqppb\" (UID: \"a47bc793-deb1-42d1-9759-42c79b7ef053\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gqppb" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571400 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c6840e9-1b32-4e33-aa8c-31285246df48-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-p4ft4\" (UID: \"9c6840e9-1b32-4e33-aa8c-31285246df48\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p4ft4" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571420 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eaa6a825-72b4-4544-9e19-5af6b2c7648e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tx86n\" (UID: \"eaa6a825-72b4-4544-9e19-5af6b2c7648e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n" Feb 28 09:05:05 crc 
kubenswrapper[4687]: I0228 09:05:05.571437 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0-serving-cert\") pod \"etcd-operator-b45778765-dvw6x\" (UID: \"317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dvw6x" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571459 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/defaa3ff-8d11-49b1-a9d4-7f54a0650d0a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-stsx5\" (UID: \"defaa3ff-8d11-49b1-a9d4-7f54a0650d0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-stsx5" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571479 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4bfbdc6f-2078-4dee-b253-d7658f4e839d-webhook-cert\") pod \"packageserver-d55dfcdfc-w9g6h\" (UID: \"4bfbdc6f-2078-4dee-b253-d7658f4e839d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9g6h" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571501 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0-etcd-service-ca\") pod \"etcd-operator-b45778765-dvw6x\" (UID: \"317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dvw6x" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571521 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgjd2\" (UniqueName: \"kubernetes.io/projected/c3d794dc-474f-4572-8227-60bc4a41c69e-kube-api-access-mgjd2\") pod \"kube-storage-version-migrator-operator-b67b599dd-7tgnq\" (UID: 
\"c3d794dc-474f-4572-8227-60bc4a41c69e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7tgnq" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571543 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eaa6a825-72b4-4544-9e19-5af6b2c7648e-client-ca\") pod \"controller-manager-879f6c89f-tx86n\" (UID: \"eaa6a825-72b4-4544-9e19-5af6b2c7648e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571561 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a47bc793-deb1-42d1-9759-42c79b7ef053-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gqppb\" (UID: \"a47bc793-deb1-42d1-9759-42c79b7ef053\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gqppb" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.571971 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3b45242a-b238-4814-b6fa-f22a62c5907f-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-jb8xd\" (UID: \"3b45242a-b238-4814-b6fa-f22a62c5907f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.572639 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/083e169f-d8ba-4454-a2e4-84587ae7551c-config\") pod \"authentication-operator-69f744f599-rsgcf\" (UID: \"083e169f-d8ba-4454-a2e4-84587ae7551c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rsgcf" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.573097 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/eb51fd3a-6467-4fcd-b3c0-6e8efa30aa2b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-shn8j\" (UID: \"eb51fd3a-6467-4fcd-b3c0-6e8efa30aa2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-shn8j" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.573267 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/083e169f-d8ba-4454-a2e4-84587ae7551c-service-ca-bundle\") pod \"authentication-operator-69f744f599-rsgcf\" (UID: \"083e169f-d8ba-4454-a2e4-84587ae7551c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rsgcf" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.573440 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04e92ab6-5602-4d97-9e70-f95ff9769a79-config\") pod \"route-controller-manager-6576b87f9c-4f5xw\" (UID: \"04e92ab6-5602-4d97-9e70-f95ff9769a79\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.573547 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4bfbdc6f-2078-4dee-b253-d7658f4e839d-tmpfs\") pod \"packageserver-d55dfcdfc-w9g6h\" (UID: \"4bfbdc6f-2078-4dee-b253-d7658f4e839d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9g6h" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.573610 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/083e169f-d8ba-4454-a2e4-84587ae7551c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-rsgcf\" (UID: \"083e169f-d8ba-4454-a2e4-84587ae7551c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rsgcf" Feb 28 09:05:05 crc 
kubenswrapper[4687]: I0228 09:05:05.574055 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5aacd998-f8bd-49e1-8d54-a4775c7e1f83-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4j8df\" (UID: \"5aacd998-f8bd-49e1-8d54-a4775c7e1f83\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4j8df" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.574271 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eaa6a825-72b4-4544-9e19-5af6b2c7648e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tx86n\" (UID: \"eaa6a825-72b4-4544-9e19-5af6b2c7648e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.574470 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04e92ab6-5602-4d97-9e70-f95ff9769a79-client-ca\") pod \"route-controller-manager-6576b87f9c-4f5xw\" (UID: \"04e92ab6-5602-4d97-9e70-f95ff9769a79\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.575189 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eaa6a825-72b4-4544-9e19-5af6b2c7648e-client-ca\") pod \"controller-manager-879f6c89f-tx86n\" (UID: \"eaa6a825-72b4-4544-9e19-5af6b2c7648e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.575295 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a47bc793-deb1-42d1-9759-42c79b7ef053-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gqppb\" (UID: \"a47bc793-deb1-42d1-9759-42c79b7ef053\") 
" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gqppb" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.576156 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3b45242a-b238-4814-b6fa-f22a62c5907f-ready\") pod \"cni-sysctl-allowlist-ds-jb8xd\" (UID: \"3b45242a-b238-4814-b6fa-f22a62c5907f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.576449 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb51fd3a-6467-4fcd-b3c0-6e8efa30aa2b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-shn8j\" (UID: \"eb51fd3a-6467-4fcd-b3c0-6e8efa30aa2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-shn8j" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.576795 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a47bc793-deb1-42d1-9759-42c79b7ef053-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gqppb\" (UID: \"a47bc793-deb1-42d1-9759-42c79b7ef053\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gqppb" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.577054 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaa6a825-72b4-4544-9e19-5af6b2c7648e-config\") pod \"controller-manager-879f6c89f-tx86n\" (UID: \"eaa6a825-72b4-4544-9e19-5af6b2c7648e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.577410 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/083e169f-d8ba-4454-a2e4-84587ae7551c-serving-cert\") pod \"authentication-operator-69f744f599-rsgcf\" (UID: \"083e169f-d8ba-4454-a2e4-84587ae7551c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rsgcf" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.579135 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eaa6a825-72b4-4544-9e19-5af6b2c7648e-serving-cert\") pod \"controller-manager-879f6c89f-tx86n\" (UID: \"eaa6a825-72b4-4544-9e19-5af6b2c7648e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.579163 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04e92ab6-5602-4d97-9e70-f95ff9769a79-serving-cert\") pod \"route-controller-manager-6576b87f9c-4f5xw\" (UID: \"04e92ab6-5602-4d97-9e70-f95ff9769a79\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.584913 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.604155 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.624124 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.644869 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.664054 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 28 09:05:05 crc 
kubenswrapper[4687]: I0228 09:05:05.683937 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.703870 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.724319 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.744699 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.768414 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.775951 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/01c22693-ad9c-426f-8ae6-ac335c7cbca1-metrics-tls\") pod \"dns-operator-744455d44c-5grmr\" (UID: \"01c22693-ad9c-426f-8ae6-ac335c7cbca1\") " pod="openshift-dns-operator/dns-operator-744455d44c-5grmr" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.784797 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.804379 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.824171 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.836286 4687 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3d794dc-474f-4572-8227-60bc4a41c69e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7tgnq\" (UID: \"c3d794dc-474f-4572-8227-60bc4a41c69e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7tgnq" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.843819 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.852853 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3d794dc-474f-4572-8227-60bc4a41c69e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7tgnq\" (UID: \"c3d794dc-474f-4572-8227-60bc4a41c69e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7tgnq" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.865900 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.884367 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.903649 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.923921 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.944224 4687 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.955113 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/80136d7f-e0f8-4ff5-a22a-b5933f9e2cf0-profile-collector-cert\") pod \"catalog-operator-68c6474976-4xptn\" (UID: \"80136d7f-e0f8-4ff5-a22a-b5933f9e2cf0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xptn" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.964607 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 28 09:05:05 crc kubenswrapper[4687]: I0228 09:05:05.986134 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.003674 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.024159 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.044682 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.063508 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.077996 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c6840e9-1b32-4e33-aa8c-31285246df48-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-p4ft4\" (UID: 
\"9c6840e9-1b32-4e33-aa8c-31285246df48\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p4ft4" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.083900 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.105180 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.125133 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.144395 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.171618 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.183847 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.193243 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c6840e9-1b32-4e33-aa8c-31285246df48-config\") pod \"kube-apiserver-operator-766d6c64bb-p4ft4\" (UID: \"9c6840e9-1b32-4e33-aa8c-31285246df48\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p4ft4" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.204426 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.215533 4687 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0-etcd-client\") pod \"etcd-operator-b45778765-dvw6x\" (UID: \"317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dvw6x" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.225456 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.244576 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.263961 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.283632 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.284329 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0-etcd-service-ca\") pod \"etcd-operator-b45778765-dvw6x\" (UID: \"317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dvw6x" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.304051 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.312678 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/941eced7-a875-42d3-91a4-d36f770b30a6-config\") pod \"kube-controller-manager-operator-78b949d7b-s2d57\" (UID: 
\"941eced7-a875-42d3-91a4-d36f770b30a6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s2d57" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.324078 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.344544 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.357615 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0-serving-cert\") pod \"etcd-operator-b45778765-dvw6x\" (UID: \"317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dvw6x" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.362520 4687 request.go:700] Waited for 1.010728444s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/configmaps?fieldSelector=metadata.name%3Detcd-ca-bundle&limit=500&resourceVersion=0 Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.364156 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.375488 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0-etcd-ca\") pod \"etcd-operator-b45778765-dvw6x\" (UID: \"317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dvw6x" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.384637 4687 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.394729 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/941eced7-a875-42d3-91a4-d36f770b30a6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-s2d57\" (UID: \"941eced7-a875-42d3-91a4-d36f770b30a6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s2d57" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.403826 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.413607 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0-config\") pod \"etcd-operator-b45778765-dvw6x\" (UID: \"317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dvw6x" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.424554 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.445250 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.456695 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/80136d7f-e0f8-4ff5-a22a-b5933f9e2cf0-srv-cert\") pod \"catalog-operator-68c6474976-4xptn\" (UID: \"80136d7f-e0f8-4ff5-a22a-b5933f9e2cf0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xptn" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.464795 4687 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.485211 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.497590 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/defaa3ff-8d11-49b1-a9d4-7f54a0650d0a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-stsx5\" (UID: \"defaa3ff-8d11-49b1-a9d4-7f54a0650d0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-stsx5" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.505038 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.514983 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defaa3ff-8d11-49b1-a9d4-7f54a0650d0a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-stsx5\" (UID: \"defaa3ff-8d11-49b1-a9d4-7f54a0650d0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-stsx5" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.523947 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.543831 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.564177 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 28 09:05:06 
crc kubenswrapper[4687]: E0228 09:05:06.573044 4687 configmap.go:193] Couldn't get configMap openshift-multus/cni-sysctl-allowlist: failed to sync configmap cache: timed out waiting for the condition Feb 28 09:05:06 crc kubenswrapper[4687]: E0228 09:05:06.573065 4687 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Feb 28 09:05:06 crc kubenswrapper[4687]: E0228 09:05:06.573117 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3b45242a-b238-4814-b6fa-f22a62c5907f-cni-sysctl-allowlist podName:3b45242a-b238-4814-b6fa-f22a62c5907f nodeName:}" failed. No retries permitted until 2026-02-28 09:05:07.073096902 +0000 UTC m=+98.763666239 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cni-sysctl-allowlist" (UniqueName: "kubernetes.io/configmap/3b45242a-b238-4814-b6fa-f22a62c5907f-cni-sysctl-allowlist") pod "cni-sysctl-allowlist-ds-jb8xd" (UID: "3b45242a-b238-4814-b6fa-f22a62c5907f") : failed to sync configmap cache: timed out waiting for the condition Feb 28 09:05:06 crc kubenswrapper[4687]: E0228 09:05:06.573142 4687 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Feb 28 09:05:06 crc kubenswrapper[4687]: E0228 09:05:06.573188 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bfbdc6f-2078-4dee-b253-d7658f4e839d-apiservice-cert podName:4bfbdc6f-2078-4dee-b253-d7658f4e839d nodeName:}" failed. No retries permitted until 2026-02-28 09:05:07.073152526 +0000 UTC m=+98.763721874 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/4bfbdc6f-2078-4dee-b253-d7658f4e839d-apiservice-cert") pod "packageserver-d55dfcdfc-w9g6h" (UID: "4bfbdc6f-2078-4dee-b253-d7658f4e839d") : failed to sync secret cache: timed out waiting for the condition Feb 28 09:05:06 crc kubenswrapper[4687]: E0228 09:05:06.573261 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aa62668-b8d2-4c84-a890-c91f79aae6e6-node-bootstrap-token podName:4aa62668-b8d2-4c84-a890-c91f79aae6e6 nodeName:}" failed. No retries permitted until 2026-02-28 09:05:07.073229621 +0000 UTC m=+98.763798959 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/4aa62668-b8d2-4c84-a890-c91f79aae6e6-node-bootstrap-token") pod "machine-config-server-2kghk" (UID: "4aa62668-b8d2-4c84-a890-c91f79aae6e6") : failed to sync secret cache: timed out waiting for the condition Feb 28 09:05:06 crc kubenswrapper[4687]: E0228 09:05:06.574710 4687 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Feb 28 09:05:06 crc kubenswrapper[4687]: E0228 09:05:06.574722 4687 secret.go:188] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition Feb 28 09:05:06 crc kubenswrapper[4687]: E0228 09:05:06.574748 4687 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition Feb 28 09:05:06 crc kubenswrapper[4687]: E0228 09:05:06.574765 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aa62668-b8d2-4c84-a890-c91f79aae6e6-certs podName:4aa62668-b8d2-4c84-a890-c91f79aae6e6 nodeName:}" failed. 
No retries permitted until 2026-02-28 09:05:07.074752426 +0000 UTC m=+98.765321764 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/4aa62668-b8d2-4c84-a890-c91f79aae6e6-certs") pod "machine-config-server-2kghk" (UID: "4aa62668-b8d2-4c84-a890-c91f79aae6e6") : failed to sync secret cache: timed out waiting for the condition Feb 28 09:05:06 crc kubenswrapper[4687]: E0228 09:05:06.574805 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5aacd998-f8bd-49e1-8d54-a4775c7e1f83-images podName:5aacd998-f8bd-49e1-8d54-a4775c7e1f83 nodeName:}" failed. No retries permitted until 2026-02-28 09:05:07.074792623 +0000 UTC m=+98.765361960 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/5aacd998-f8bd-49e1-8d54-a4775c7e1f83-images") pod "machine-config-operator-74547568cd-4j8df" (UID: "5aacd998-f8bd-49e1-8d54-a4775c7e1f83") : failed to sync configmap cache: timed out waiting for the condition Feb 28 09:05:06 crc kubenswrapper[4687]: E0228 09:05:06.574806 4687 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Feb 28 09:05:06 crc kubenswrapper[4687]: E0228 09:05:06.574839 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5aacd998-f8bd-49e1-8d54-a4775c7e1f83-proxy-tls podName:5aacd998-f8bd-49e1-8d54-a4775c7e1f83 nodeName:}" failed. No retries permitted until 2026-02-28 09:05:07.074813732 +0000 UTC m=+98.765383069 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5aacd998-f8bd-49e1-8d54-a4775c7e1f83-proxy-tls") pod "machine-config-operator-74547568cd-4j8df" (UID: "5aacd998-f8bd-49e1-8d54-a4775c7e1f83") : failed to sync secret cache: timed out waiting for the condition
Feb 28 09:05:06 crc kubenswrapper[4687]: E0228 09:05:06.574864 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bfbdc6f-2078-4dee-b253-d7658f4e839d-webhook-cert podName:4bfbdc6f-2078-4dee-b253-d7658f4e839d nodeName:}" failed. No retries permitted until 2026-02-28 09:05:07.074856493 +0000 UTC m=+98.765425829 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/4bfbdc6f-2078-4dee-b253-d7658f4e839d-webhook-cert") pod "packageserver-d55dfcdfc-w9g6h" (UID: "4bfbdc6f-2078-4dee-b253-d7658f4e839d") : failed to sync secret cache: timed out waiting for the condition
Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.584956 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.603961 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.644208 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.656387 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7h597"
Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.656429 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.656386 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.656568 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.665500 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.683833 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.704045 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.724419 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.744737 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.770435 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.784774 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.805525 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.825046 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.844606 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.864252 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.884103 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.903661 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.923783 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.944211 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.963853 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 28 09:05:06 crc kubenswrapper[4687]: I0228 09:05:06.984082 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.003709 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.024311 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.043908 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.064603 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.084609 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.091122 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5aacd998-f8bd-49e1-8d54-a4775c7e1f83-proxy-tls\") pod \"machine-config-operator-74547568cd-4j8df\" (UID: \"5aacd998-f8bd-49e1-8d54-a4775c7e1f83\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4j8df"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.091202 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5aacd998-f8bd-49e1-8d54-a4775c7e1f83-images\") pod \"machine-config-operator-74547568cd-4j8df\" (UID: \"5aacd998-f8bd-49e1-8d54-a4775c7e1f83\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4j8df"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.091246 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4aa62668-b8d2-4c84-a890-c91f79aae6e6-certs\") pod \"machine-config-server-2kghk\" (UID: \"4aa62668-b8d2-4c84-a890-c91f79aae6e6\") " pod="openshift-machine-config-operator/machine-config-server-2kghk"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.091269 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4bfbdc6f-2078-4dee-b253-d7658f4e839d-webhook-cert\") pod \"packageserver-d55dfcdfc-w9g6h\" (UID: \"4bfbdc6f-2078-4dee-b253-d7658f4e839d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9g6h"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.091384 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4aa62668-b8d2-4c84-a890-c91f79aae6e6-node-bootstrap-token\") pod \"machine-config-server-2kghk\" (UID: \"4aa62668-b8d2-4c84-a890-c91f79aae6e6\") " pod="openshift-machine-config-operator/machine-config-server-2kghk"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.091430 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4bfbdc6f-2078-4dee-b253-d7658f4e839d-apiservice-cert\") pod \"packageserver-d55dfcdfc-w9g6h\" (UID: \"4bfbdc6f-2078-4dee-b253-d7658f4e839d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9g6h"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.091468 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3b45242a-b238-4814-b6fa-f22a62c5907f-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-jb8xd\" (UID: \"3b45242a-b238-4814-b6fa-f22a62c5907f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.092088 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5aacd998-f8bd-49e1-8d54-a4775c7e1f83-images\") pod \"machine-config-operator-74547568cd-4j8df\" (UID: \"5aacd998-f8bd-49e1-8d54-a4775c7e1f83\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4j8df"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.092208 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3b45242a-b238-4814-b6fa-f22a62c5907f-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-jb8xd\" (UID: \"3b45242a-b238-4814-b6fa-f22a62c5907f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.096210 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4bfbdc6f-2078-4dee-b253-d7658f4e839d-apiservice-cert\") pod \"packageserver-d55dfcdfc-w9g6h\" (UID: \"4bfbdc6f-2078-4dee-b253-d7658f4e839d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9g6h"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.096465 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4aa62668-b8d2-4c84-a890-c91f79aae6e6-node-bootstrap-token\") pod \"machine-config-server-2kghk\" (UID: \"4aa62668-b8d2-4c84-a890-c91f79aae6e6\") " pod="openshift-machine-config-operator/machine-config-server-2kghk"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.096531 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5aacd998-f8bd-49e1-8d54-a4775c7e1f83-proxy-tls\") pod \"machine-config-operator-74547568cd-4j8df\" (UID: \"5aacd998-f8bd-49e1-8d54-a4775c7e1f83\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4j8df"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.096895 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4bfbdc6f-2078-4dee-b253-d7658f4e839d-webhook-cert\") pod \"packageserver-d55dfcdfc-w9g6h\" (UID: \"4bfbdc6f-2078-4dee-b253-d7658f4e839d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9g6h"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.105628 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.114912 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4aa62668-b8d2-4c84-a890-c91f79aae6e6-certs\") pod \"machine-config-server-2kghk\" (UID: \"4aa62668-b8d2-4c84-a890-c91f79aae6e6\") " pod="openshift-machine-config-operator/machine-config-server-2kghk"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.124387 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.164368 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.183512 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.203936 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.224123 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.243509 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.263686 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.284411 4687 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.315803 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84z6r\" (UniqueName: \"kubernetes.io/projected/4553b1b1-a202-4e03-8d3a-bdb2eb6042d1-kube-api-access-84z6r\") pod \"console-operator-58897d9998-6df9f\" (UID: \"4553b1b1-a202-4e03-8d3a-bdb2eb6042d1\") " pod="openshift-console-operator/console-operator-58897d9998-6df9f"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.335366 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j9v5\" (UniqueName: \"kubernetes.io/projected/32938de1-4583-4038-8ced-e6bc1327911a-kube-api-access-4j9v5\") pod \"machine-approver-56656f9798-lmsz6\" (UID: \"32938de1-4583-4038-8ced-e6bc1327911a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lmsz6"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.344312 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.363448 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.382586 4687 request.go:700] Waited for 1.879930111s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/configmaps?fieldSelector=metadata.name%3Ddns-default&limit=500&resourceVersion=0
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.384405 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.415964 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gxcf\" (UniqueName: \"kubernetes.io/projected/80136d7f-e0f8-4ff5-a22a-b5933f9e2cf0-kube-api-access-7gxcf\") pod \"catalog-operator-68c6474976-4xptn\" (UID: \"80136d7f-e0f8-4ff5-a22a-b5933f9e2cf0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xptn"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.427118 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lmsz6"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.438992 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87d5p\" (UniqueName: \"kubernetes.io/projected/5aacd998-f8bd-49e1-8d54-a4775c7e1f83-kube-api-access-87d5p\") pod \"machine-config-operator-74547568cd-4j8df\" (UID: \"5aacd998-f8bd-49e1-8d54-a4775c7e1f83\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4j8df"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.456650 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-6df9f"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.458898 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a47bc793-deb1-42d1-9759-42c79b7ef053-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gqppb\" (UID: \"a47bc793-deb1-42d1-9759-42c79b7ef053\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gqppb"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.477620 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-599bw\" (UniqueName: \"kubernetes.io/projected/4bfbdc6f-2078-4dee-b253-d7658f4e839d-kube-api-access-599bw\") pod \"packageserver-d55dfcdfc-w9g6h\" (UID: \"4bfbdc6f-2078-4dee-b253-d7658f4e839d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9g6h"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.496657 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/defaa3ff-8d11-49b1-a9d4-7f54a0650d0a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-stsx5\" (UID: \"defaa3ff-8d11-49b1-a9d4-7f54a0650d0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-stsx5"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.515220 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wxsl\" (UniqueName: \"kubernetes.io/projected/04e92ab6-5602-4d97-9e70-f95ff9769a79-kube-api-access-6wxsl\") pod \"route-controller-manager-6576b87f9c-4f5xw\" (UID: \"04e92ab6-5602-4d97-9e70-f95ff9769a79\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.537280 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c6840e9-1b32-4e33-aa8c-31285246df48-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-p4ft4\" (UID: \"9c6840e9-1b32-4e33-aa8c-31285246df48\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p4ft4"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.551644 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.558934 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/941eced7-a875-42d3-91a4-d36f770b30a6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-s2d57\" (UID: \"941eced7-a875-42d3-91a4-d36f770b30a6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s2d57"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.577303 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ccr7\" (UniqueName: \"kubernetes.io/projected/01c22693-ad9c-426f-8ae6-ac335c7cbca1-kube-api-access-6ccr7\") pod \"dns-operator-744455d44c-5grmr\" (UID: \"01c22693-ad9c-426f-8ae6-ac335c7cbca1\") " pod="openshift-dns-operator/dns-operator-744455d44c-5grmr"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.596600 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-5grmr"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.598990 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sp8t\" (UniqueName: \"kubernetes.io/projected/3b45242a-b238-4814-b6fa-f22a62c5907f-kube-api-access-6sp8t\") pod \"cni-sysctl-allowlist-ds-jb8xd\" (UID: \"3b45242a-b238-4814-b6fa-f22a62c5907f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.611058 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6df9f"]
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.617927 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgjd2\" (UniqueName: \"kubernetes.io/projected/c3d794dc-474f-4572-8227-60bc4a41c69e-kube-api-access-mgjd2\") pod \"kube-storage-version-migrator-operator-b67b599dd-7tgnq\" (UID: \"c3d794dc-474f-4572-8227-60bc4a41c69e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7tgnq"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.621759 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p4ft4"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.632963 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s2d57"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.636626 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtrj9\" (UniqueName: \"kubernetes.io/projected/317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0-kube-api-access-jtrj9\") pod \"etcd-operator-b45778765-dvw6x\" (UID: \"317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dvw6x"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.654836 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-dvw6x"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.662730 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr4lw\" (UniqueName: \"kubernetes.io/projected/eaa6a825-72b4-4544-9e19-5af6b2c7648e-kube-api-access-vr4lw\") pod \"controller-manager-879f6c89f-tx86n\" (UID: \"eaa6a825-72b4-4544-9e19-5af6b2c7648e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.663490 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-stsx5"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.674849 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xptn"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.684189 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-748sw\" (UniqueName: \"kubernetes.io/projected/eb51fd3a-6467-4fcd-b3c0-6e8efa30aa2b-kube-api-access-748sw\") pod \"openshift-controller-manager-operator-756b6f6bc6-shn8j\" (UID: \"eb51fd3a-6467-4fcd-b3c0-6e8efa30aa2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-shn8j"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.697160 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw"]
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.699181 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj6pb\" (UniqueName: \"kubernetes.io/projected/a47bc793-deb1-42d1-9759-42c79b7ef053-kube-api-access-mj6pb\") pod \"cluster-image-registry-operator-dc59b4c8b-gqppb\" (UID: \"a47bc793-deb1-42d1-9759-42c79b7ef053\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gqppb"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.710134 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9g6h"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.715712 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4j8df"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.718078 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blgwk\" (UniqueName: \"kubernetes.io/projected/4aa62668-b8d2-4c84-a890-c91f79aae6e6-kube-api-access-blgwk\") pod \"machine-config-server-2kghk\" (UID: \"4aa62668-b8d2-4c84-a890-c91f79aae6e6\") " pod="openshift-machine-config-operator/machine-config-server-2kghk"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.741901 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhnwc\" (UniqueName: \"kubernetes.io/projected/083e169f-d8ba-4454-a2e4-84587ae7551c-kube-api-access-xhnwc\") pod \"authentication-operator-69f744f599-rsgcf\" (UID: \"083e169f-d8ba-4454-a2e4-84587ae7551c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rsgcf"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.749584 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.764712 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.765762 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-2kghk"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.784301 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.785378 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gqppb"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.802103 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p4ft4"]
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.804574 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.816794 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.823872 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.825168 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-shn8j"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.843608 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.858379 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-rsgcf"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.863377 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.908831 4687 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.974856 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-stsx5"]
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.986583 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7tgnq"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.987908 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-trusted-ca\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.987937 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63d9a180-a0d1-474e-a850-9a4235c5ac62-trusted-ca\") pod \"ingress-operator-5b745b69d9-vx4hz\" (UID: \"63d9a180-a0d1-474e-a850-9a4235c5ac62\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vx4hz"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.987958 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9292d86c-b9c1-4a63-a766-c25874ffa2f5-config\") pod \"machine-api-operator-5694c8668f-9thbt\" (UID: \"9292d86c-b9c1-4a63-a766-c25874ffa2f5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9thbt"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.987977 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.988192 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e899d87a-f034-4436-8409-ca04178918b7-config-volume\") pod \"collect-profiles-29537820-r5c29\" (UID: \"e899d87a-f034-4436-8409-ca04178918b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537820-r5c29"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.988223 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4beb3d39-9d4e-4964-9567-67396e456053-serving-cert\") pod \"service-ca-operator-777779d784-bk6v7\" (UID: \"4beb3d39-9d4e-4964-9567-67396e456053\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bk6v7"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.988242 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.988337 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.988388 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6278\" (UniqueName: \"kubernetes.io/projected/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-kube-api-access-v6278\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.988410 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96e679f2-11c5-4ade-abc4-56a7b85a5668-trusted-ca-bundle\") pod \"console-f9d7485db-4m8kh\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " pod="openshift-console/console-f9d7485db-4m8kh"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.988470 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96e679f2-11c5-4ade-abc4-56a7b85a5668-oauth-serving-cert\") pod \"console-f9d7485db-4m8kh\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " pod="openshift-console/console-f9d7485db-4m8kh"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.988942 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7461d892-4781-495c-b78f-5fe375ed4f44-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-494jw\" (UID: \"7461d892-4781-495c-b78f-5fe375ed4f44\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-494jw"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.989052 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f77b68ae-c1dd-481b-a831-d4698d8f44a0-serving-cert\") pod \"openshift-config-operator-7777fb866f-w4s2q\" (UID: \"f77b68ae-c1dd-481b-a831-d4698d8f44a0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-w4s2q"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.989143 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96e679f2-11c5-4ade-abc4-56a7b85a5668-console-serving-cert\") pod \"console-f9d7485db-4m8kh\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " pod="openshift-console/console-f9d7485db-4m8kh"
Feb 28 09:05:07 crc kubenswrapper[4687]: I0228 09:05:07.989399 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/100b328c-d3fd-4a0f-82e5-428f29240fc4-signing-key\") pod \"service-ca-9c57cc56f-q26sq\" (UID: \"100b328c-d3fd-4a0f-82e5-428f29240fc4\") " pod="openshift-service-ca/service-ca-9c57cc56f-q26sq"
Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.989709 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr"
Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.989749 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58h2p\" (UniqueName: \"kubernetes.io/projected/7461d892-4781-495c-b78f-5fe375ed4f44-kube-api-access-58h2p\") pod \"control-plane-machine-set-operator-78cbb6b69f-494jw\" (UID: \"7461d892-4781-495c-b78f-5fe375ed4f44\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-494jw"
Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.989793 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09d34ddd-da09-46e8-a9d5-5f395dbe8625-audit-policies\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr"
Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.989825 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k"
Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.989891 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr"
Feb 28 09:05:08 crc kubenswrapper[4687]: E0228 09:05:07.990127 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:08.4901144 +0000 UTC m=+100.180683737 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.990352 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr"
Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.990373 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr"
Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.990394 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-audit\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") "
pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.990413 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-installation-pull-secrets\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.990430 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8a30ffde-a939-4553-9c76-62164e19d8c6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-qns64\" (UID: \"8a30ffde-a939-4553-9c76-62164e19d8c6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qns64" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.990447 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f77b68ae-c1dd-481b-a831-d4698d8f44a0-available-featuregates\") pod \"openshift-config-operator-7777fb866f-w4s2q\" (UID: \"f77b68ae-c1dd-481b-a831-d4698d8f44a0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-w4s2q" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.990466 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-bound-sa-token\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.990482 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.990514 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-ca-trust-extracted\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.990535 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jjss\" (UniqueName: \"kubernetes.io/projected/a8601fc9-0325-4c09-951c-afdda2eb2e05-kube-api-access-6jjss\") pod \"machine-config-controller-84d6567774-fb24z\" (UID: \"a8601fc9-0325-4c09-951c-afdda2eb2e05\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fb24z" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.990555 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.990646 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a8601fc9-0325-4c09-951c-afdda2eb2e05-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-fb24z\" (UID: 
\"a8601fc9-0325-4c09-951c-afdda2eb2e05\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fb24z" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.990677 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-etcd-serving-ca\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.990740 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29j69\" (UniqueName: \"kubernetes.io/projected/e899d87a-f034-4436-8409-ca04178918b7-kube-api-access-29j69\") pod \"collect-profiles-29537820-r5c29\" (UID: \"e899d87a-f034-4436-8409-ca04178918b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537820-r5c29" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.990765 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36a32d28-84e1-4c44-b2e5-546c8a1c8853-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5xt25\" (UID: \"36a32d28-84e1-4c44-b2e5-546c8a1c8853\") " pod="openshift-marketplace/marketplace-operator-79b997595-5xt25" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.990794 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96e679f2-11c5-4ade-abc4-56a7b85a5668-console-oauth-config\") pod \"console-f9d7485db-4m8kh\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " pod="openshift-console/console-f9d7485db-4m8kh" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.990820 4687 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-registry-tls\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.990844 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/36a32d28-84e1-4c44-b2e5-546c8a1c8853-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5xt25\" (UID: \"36a32d28-84e1-4c44-b2e5-546c8a1c8853\") " pod="openshift-marketplace/marketplace-operator-79b997595-5xt25" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.990867 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6ec553c-b9e9-4c6e-a1d1-6d730702968f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-w7jmh\" (UID: \"a6ec553c-b9e9-4c6e-a1d1-6d730702968f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w7jmh" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.990900 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg7br\" (UniqueName: \"kubernetes.io/projected/f77b68ae-c1dd-481b-a831-d4698d8f44a0-kube-api-access-dg7br\") pod \"openshift-config-operator-7777fb866f-w4s2q\" (UID: \"f77b68ae-c1dd-481b-a831-d4698d8f44a0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-w4s2q" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.990917 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-serving-cert\") pod 
\"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.990933 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96e679f2-11c5-4ade-abc4-56a7b85a5668-console-config\") pod \"console-f9d7485db-4m8kh\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " pod="openshift-console/console-f9d7485db-4m8kh" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.990955 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/63d9a180-a0d1-474e-a850-9a4235c5ac62-metrics-tls\") pod \"ingress-operator-5b745b69d9-vx4hz\" (UID: \"63d9a180-a0d1-474e-a850-9a4235c5ac62\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vx4hz" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.990994 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-encryption-config\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.991044 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-etcd-client\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.991075 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdlxb\" (UniqueName: 
\"kubernetes.io/projected/36a32d28-84e1-4c44-b2e5-546c8a1c8853-kube-api-access-pdlxb\") pod \"marketplace-operator-79b997595-5xt25\" (UID: \"36a32d28-84e1-4c44-b2e5-546c8a1c8853\") " pod="openshift-marketplace/marketplace-operator-79b997595-5xt25" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.991115 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnc48\" (UniqueName: \"kubernetes.io/projected/8a30ffde-a939-4553-9c76-62164e19d8c6-kube-api-access-jnc48\") pod \"multus-admission-controller-857f4d67dd-qns64\" (UID: \"8a30ffde-a939-4553-9c76-62164e19d8c6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qns64" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.991156 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e899d87a-f034-4436-8409-ca04178918b7-secret-volume\") pod \"collect-profiles-29537820-r5c29\" (UID: \"e899d87a-f034-4436-8409-ca04178918b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537820-r5c29" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.991173 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-config\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.991205 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-registry-certificates\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 
09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.991234 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.991253 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll94z\" (UniqueName: \"kubernetes.io/projected/4beb3d39-9d4e-4964-9567-67396e456053-kube-api-access-ll94z\") pod \"service-ca-operator-777779d784-bk6v7\" (UID: \"4beb3d39-9d4e-4964-9567-67396e456053\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bk6v7" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.991267 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-974wt\" (UniqueName: \"kubernetes.io/projected/63d9a180-a0d1-474e-a850-9a4235c5ac62-kube-api-access-974wt\") pod \"ingress-operator-5b745b69d9-vx4hz\" (UID: \"63d9a180-a0d1-474e-a850-9a4235c5ac62\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vx4hz" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.991365 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8601fc9-0325-4c09-951c-afdda2eb2e05-proxy-tls\") pod \"machine-config-controller-84d6567774-fb24z\" (UID: \"a8601fc9-0325-4c09-951c-afdda2eb2e05\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fb24z" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.991441 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" 
(UniqueName: \"kubernetes.io/configmap/100b328c-d3fd-4a0f-82e5-428f29240fc4-signing-cabundle\") pod \"service-ca-9c57cc56f-q26sq\" (UID: \"100b328c-d3fd-4a0f-82e5-428f29240fc4\") " pod="openshift-service-ca/service-ca-9c57cc56f-q26sq" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.991459 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96e679f2-11c5-4ade-abc4-56a7b85a5668-service-ca\") pod \"console-f9d7485db-4m8kh\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " pod="openshift-console/console-f9d7485db-4m8kh" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.991475 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6fj2\" (UniqueName: \"kubernetes.io/projected/a6ec553c-b9e9-4c6e-a1d1-6d730702968f-kube-api-access-m6fj2\") pod \"cluster-samples-operator-665b6dd947-w7jmh\" (UID: \"a6ec553c-b9e9-4c6e-a1d1-6d730702968f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w7jmh" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.991491 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwr7x\" (UniqueName: \"kubernetes.io/projected/4aa07587-0d38-4e29-92ef-c6957b5526a8-kube-api-access-bwr7x\") pod \"downloads-7954f5f757-8vhfl\" (UID: \"4aa07587-0d38-4e29-92ef-c6957b5526a8\") " pod="openshift-console/downloads-7954f5f757-8vhfl" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.991512 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-audit-dir\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.991528 
4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q28j9\" (UniqueName: \"kubernetes.io/projected/09d34ddd-da09-46e8-a9d5-5f395dbe8625-kube-api-access-q28j9\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.991567 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4beb3d39-9d4e-4964-9567-67396e456053-config\") pod \"service-ca-operator-777779d784-bk6v7\" (UID: \"4beb3d39-9d4e-4964-9567-67396e456053\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bk6v7" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.991583 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-image-import-ca\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.991596 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/09d34ddd-da09-46e8-a9d5-5f395dbe8625-audit-dir\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.991613 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.991634 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-node-pullsecrets\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.991649 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-554cb\" (UniqueName: \"kubernetes.io/projected/100b328c-d3fd-4a0f-82e5-428f29240fc4-kube-api-access-554cb\") pod \"service-ca-9c57cc56f-q26sq\" (UID: \"100b328c-d3fd-4a0f-82e5-428f29240fc4\") " pod="openshift-service-ca/service-ca-9c57cc56f-q26sq" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.991670 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbwps\" (UniqueName: \"kubernetes.io/projected/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-kube-api-access-kbwps\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.991684 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pppwd\" (UniqueName: \"kubernetes.io/projected/96e679f2-11c5-4ade-abc4-56a7b85a5668-kube-api-access-pppwd\") pod \"console-f9d7485db-4m8kh\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " pod="openshift-console/console-f9d7485db-4m8kh" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.991702 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63d9a180-a0d1-474e-a850-9a4235c5ac62-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vx4hz\" (UID: \"63d9a180-a0d1-474e-a850-9a4235c5ac62\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vx4hz" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:07.991716 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.011749 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9g6h"] Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.016041 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4j8df"] Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.037393 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-stsx5" event={"ID":"defaa3ff-8d11-49b1-a9d4-7f54a0650d0a","Type":"ContainerStarted","Data":"51927f63a7b36614c571527a35951eb1dc97fddfb51e4ed366d0ce2ba562ecd8"} Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.040229 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5grmr"] Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.041766 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw" 
event={"ID":"04e92ab6-5602-4d97-9e70-f95ff9769a79","Type":"ContainerStarted","Data":"6815c81392e78fe2f1ed67021a45475bdcf55c9543991ada7fcff7ba4898c6c3"} Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.041821 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw" event={"ID":"04e92ab6-5602-4d97-9e70-f95ff9769a79","Type":"ContainerStarted","Data":"7499533b994e92ae0b8da7ad394eb713d8de2b517c8f68351166a51b6cdd5038"} Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.042780 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.046329 4687 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-4f5xw container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.046431 4687 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw" podUID="04e92ab6-5602-4d97-9e70-f95ff9769a79" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.057993 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-shn8j"] Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.064553 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gqppb"] Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.065558 
4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-6df9f" event={"ID":"4553b1b1-a202-4e03-8d3a-bdb2eb6042d1","Type":"ContainerStarted","Data":"3d18f83465c94a5e805777cd98712cc375ce1ff026f5efb72ea074c40c001f8c"} Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.065581 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-6df9f" event={"ID":"4553b1b1-a202-4e03-8d3a-bdb2eb6042d1","Type":"ContainerStarted","Data":"4c9bc4ff3e6d674e83b2d94dc1347eae6d3c77b2aa8386b1568314538b72ae98"} Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.066087 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-6df9f" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.066979 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tx86n"] Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.069330 4687 patch_prober.go:28] interesting pod/console-operator-58897d9998-6df9f container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/readyz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.069374 4687 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-6df9f" podUID="4553b1b1-a202-4e03-8d3a-bdb2eb6042d1" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/readyz\": dial tcp 10.217.0.5:8443: connect: connection refused" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.070663 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lmsz6" 
event={"ID":"32938de1-4583-4038-8ced-e6bc1327911a","Type":"ContainerStarted","Data":"d62688499e7f0d95a008e55c832adce8868ed47b8fc661613fc5a646fe48c724"} Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.070684 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lmsz6" event={"ID":"32938de1-4583-4038-8ced-e6bc1327911a","Type":"ContainerStarted","Data":"60c83f773ee0344632227b6cc947e16170b5f172b3182e0abd548edf10807c52"} Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.070701 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lmsz6" event={"ID":"32938de1-4583-4038-8ced-e6bc1327911a","Type":"ContainerStarted","Data":"0bdf22a53167762f00af78a0ece946ee01fc0d4b71d98a8177199e2be62c9acf"} Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.071650 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p4ft4" event={"ID":"9c6840e9-1b32-4e33-aa8c-31285246df48","Type":"ContainerStarted","Data":"4f3a9d8a79853513e8ff15356df7e93846ff858545db03522049303ed520962c"} Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.077298 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-2kghk" event={"ID":"4aa62668-b8d2-4c84-a890-c91f79aae6e6","Type":"ContainerStarted","Data":"c0df61f7d2b51a6acf63da87ddd9dbeee6d2416aa065487a4d807b0e24eeb8f2"} Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.077348 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-2kghk" event={"ID":"4aa62668-b8d2-4c84-a890-c91f79aae6e6","Type":"ContainerStarted","Data":"b990144170eec57361f4de8a45049a6a6847fc2960e23e63ca305004b3108135"} Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.083850 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd" event={"ID":"3b45242a-b238-4814-b6fa-f22a62c5907f","Type":"ContainerStarted","Data":"6525799c5f3fba6dced15c54186b4f6d7835988f3f759f95dec8e25f4c6cb802"} Feb 28 09:05:08 crc kubenswrapper[4687]: W0228 09:05:08.087449 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb51fd3a_6467_4fcd_b3c0_6e8efa30aa2b.slice/crio-a6ec41d243cccf9901467b8716d4664cf8d3dd41c4a4257ae8f4d609722e7f5e WatchSource:0}: Error finding container a6ec41d243cccf9901467b8716d4664cf8d3dd41c4a4257ae8f4d609722e7f5e: Status 404 returned error can't find the container with id a6ec41d243cccf9901467b8716d4664cf8d3dd41c4a4257ae8f4d609722e7f5e Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.092310 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.093005 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63d9a180-a0d1-474e-a850-9a4235c5ac62-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vx4hz\" (UID: \"63d9a180-a0d1-474e-a850-9a4235c5ac62\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vx4hz" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.093054 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 
28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.093112 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/55b1fe7b-e164-4f79-835b-0cc128a680eb-metrics-certs\") pod \"router-default-5444994796-zrtwj\" (UID: \"55b1fe7b-e164-4f79-835b-0cc128a680eb\") " pod="openshift-ingress/router-default-5444994796-zrtwj" Feb 28 09:05:08 crc kubenswrapper[4687]: E0228 09:05:08.093169 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:08.593145389 +0000 UTC m=+100.283714727 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.093259 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-trusted-ca\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.093316 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63d9a180-a0d1-474e-a850-9a4235c5ac62-trusted-ca\") pod \"ingress-operator-5b745b69d9-vx4hz\" (UID: \"63d9a180-a0d1-474e-a850-9a4235c5ac62\") 
" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vx4hz" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.093336 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9292d86c-b9c1-4a63-a766-c25874ffa2f5-config\") pod \"machine-api-operator-5694c8668f-9thbt\" (UID: \"9292d86c-b9c1-4a63-a766-c25874ffa2f5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9thbt" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.093363 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/664a84bd-b59d-4f25-824f-12b593193cd2-mountpoint-dir\") pod \"csi-hostpathplugin-26znf\" (UID: \"664a84bd-b59d-4f25-824f-12b593193cd2\") " pod="hostpath-provisioner/csi-hostpathplugin-26znf" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.093383 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.093401 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgbbw\" (UniqueName: \"kubernetes.io/projected/664a84bd-b59d-4f25-824f-12b593193cd2-kube-api-access-vgbbw\") pod \"csi-hostpathplugin-26znf\" (UID: \"664a84bd-b59d-4f25-824f-12b593193cd2\") " pod="hostpath-provisioner/csi-hostpathplugin-26znf" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.093432 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/e899d87a-f034-4436-8409-ca04178918b7-config-volume\") pod \"collect-profiles-29537820-r5c29\" (UID: \"e899d87a-f034-4436-8409-ca04178918b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537820-r5c29" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.093475 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4beb3d39-9d4e-4964-9567-67396e456053-serving-cert\") pod \"service-ca-operator-777779d784-bk6v7\" (UID: \"4beb3d39-9d4e-4964-9567-67396e456053\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bk6v7" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.093491 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.093519 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.093539 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a0596eca-6aad-4812-8c5c-06c0ab0ae911-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mhzwl\" (UID: \"a0596eca-6aad-4812-8c5c-06c0ab0ae911\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mhzwl" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.093610 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6278\" (UniqueName: \"kubernetes.io/projected/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-kube-api-access-v6278\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.093655 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96e679f2-11c5-4ade-abc4-56a7b85a5668-trusted-ca-bundle\") pod \"console-f9d7485db-4m8kh\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " pod="openshift-console/console-f9d7485db-4m8kh" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.093677 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96e679f2-11c5-4ade-abc4-56a7b85a5668-oauth-serving-cert\") pod \"console-f9d7485db-4m8kh\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " pod="openshift-console/console-f9d7485db-4m8kh" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.093783 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7461d892-4781-495c-b78f-5fe375ed4f44-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-494jw\" (UID: \"7461d892-4781-495c-b78f-5fe375ed4f44\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-494jw" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.093926 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f77b68ae-c1dd-481b-a831-d4698d8f44a0-serving-cert\") pod \"openshift-config-operator-7777fb866f-w4s2q\" (UID: \"f77b68ae-c1dd-481b-a831-d4698d8f44a0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-w4s2q" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.094194 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96e679f2-11c5-4ade-abc4-56a7b85a5668-console-serving-cert\") pod \"console-f9d7485db-4m8kh\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " pod="openshift-console/console-f9d7485db-4m8kh" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.094267 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/100b328c-d3fd-4a0f-82e5-428f29240fc4-signing-key\") pod \"service-ca-9c57cc56f-q26sq\" (UID: \"100b328c-d3fd-4a0f-82e5-428f29240fc4\") " pod="openshift-service-ca/service-ca-9c57cc56f-q26sq" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.094423 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q9fd\" (UniqueName: \"kubernetes.io/projected/e00474ba-3061-4d3c-8880-05e4d50d82ae-kube-api-access-6q9fd\") pod \"migrator-59844c95c7-hhlxz\" (UID: \"e00474ba-3061-4d3c-8880-05e4d50d82ae\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hhlxz" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.094492 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.095642 4687 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e899d87a-f034-4436-8409-ca04178918b7-config-volume\") pod \"collect-profiles-29537820-r5c29\" (UID: \"e899d87a-f034-4436-8409-ca04178918b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537820-r5c29" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.096256 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9292d86c-b9c1-4a63-a766-c25874ffa2f5-config\") pod \"machine-api-operator-5694c8668f-9thbt\" (UID: \"9292d86c-b9c1-4a63-a766-c25874ffa2f5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9thbt" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.096353 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63d9a180-a0d1-474e-a850-9a4235c5ac62-trusted-ca\") pod \"ingress-operator-5b745b69d9-vx4hz\" (UID: \"63d9a180-a0d1-474e-a850-9a4235c5ac62\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vx4hz" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.097715 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-trusted-ca\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.097900 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9c633c27-c00d-4436-8b95-c327bcf08a0c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-s76rx\" (UID: \"9c633c27-c00d-4436-8b95-c327bcf08a0c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:08 crc kubenswrapper[4687]: 
I0228 09:05:08.097992 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.098427 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58h2p\" (UniqueName: \"kubernetes.io/projected/7461d892-4781-495c-b78f-5fe375ed4f44-kube-api-access-58h2p\") pod \"control-plane-machine-set-operator-78cbb6b69f-494jw\" (UID: \"7461d892-4781-495c-b78f-5fe375ed4f44\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-494jw" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.098476 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/75b60b2e-cda7-4a73-bf67-117363db768a-metrics-tls\") pod \"dns-default-kvzpk\" (UID: \"75b60b2e-cda7-4a73-bf67-117363db768a\") " pod="openshift-dns/dns-default-kvzpk" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.098565 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96e679f2-11c5-4ade-abc4-56a7b85a5668-oauth-serving-cert\") pod \"console-f9d7485db-4m8kh\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " pod="openshift-console/console-f9d7485db-4m8kh" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.098899 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96e679f2-11c5-4ade-abc4-56a7b85a5668-trusted-ca-bundle\") pod \"console-f9d7485db-4m8kh\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " 
pod="openshift-console/console-f9d7485db-4m8kh" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.102998 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.105091 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09d34ddd-da09-46e8-a9d5-5f395dbe8625-audit-policies\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.105151 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9292d86c-b9c1-4a63-a766-c25874ffa2f5-images\") pod \"machine-api-operator-5694c8668f-9thbt\" (UID: \"9292d86c-b9c1-4a63-a766-c25874ffa2f5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9thbt" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.105205 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.105228 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.105301 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55b1fe7b-e164-4f79-835b-0cc128a680eb-service-ca-bundle\") pod \"router-default-5444994796-zrtwj\" (UID: \"55b1fe7b-e164-4f79-835b-0cc128a680eb\") " pod="openshift-ingress/router-default-5444994796-zrtwj" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.105328 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chwq9\" (UniqueName: \"kubernetes.io/projected/36c2be94-93ed-4fba-9bcd-e0ebe892909e-kube-api-access-chwq9\") pod \"olm-operator-6b444d44fb-9b6d5\" (UID: \"36c2be94-93ed-4fba-9bcd-e0ebe892909e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9b6d5" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.105345 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/664a84bd-b59d-4f25-824f-12b593193cd2-plugins-dir\") pod \"csi-hostpathplugin-26znf\" (UID: \"664a84bd-b59d-4f25-824f-12b593193cd2\") " pod="hostpath-provisioner/csi-hostpathplugin-26znf" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106191 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc 
kubenswrapper[4687]: I0228 09:05:08.106218 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106258 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-audit\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106287 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c633c27-c00d-4436-8b95-c327bcf08a0c-serving-cert\") pod \"apiserver-7bbb656c7d-s76rx\" (UID: \"9c633c27-c00d-4436-8b95-c327bcf08a0c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106321 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-installation-pull-secrets\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106338 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8a30ffde-a939-4553-9c76-62164e19d8c6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-qns64\" (UID: \"8a30ffde-a939-4553-9c76-62164e19d8c6\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-qns64" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106356 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f77b68ae-c1dd-481b-a831-d4698d8f44a0-available-featuregates\") pod \"openshift-config-operator-7777fb866f-w4s2q\" (UID: \"f77b68ae-c1dd-481b-a831-d4698d8f44a0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-w4s2q" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106382 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-bound-sa-token\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106398 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106447 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-ca-trust-extracted\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106463 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/3dbb3aa8-4352-49d1-b693-10281b8e4fac-cert\") pod \"ingress-canary-47grc\" (UID: \"3dbb3aa8-4352-49d1-b693-10281b8e4fac\") " pod="openshift-ingress-canary/ingress-canary-47grc" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106479 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/664a84bd-b59d-4f25-824f-12b593193cd2-socket-dir\") pod \"csi-hostpathplugin-26znf\" (UID: \"664a84bd-b59d-4f25-824f-12b593193cd2\") " pod="hostpath-provisioner/csi-hostpathplugin-26znf" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106494 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9c633c27-c00d-4436-8b95-c327bcf08a0c-etcd-client\") pod \"apiserver-7bbb656c7d-s76rx\" (UID: \"9c633c27-c00d-4436-8b95-c327bcf08a0c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106522 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jjss\" (UniqueName: \"kubernetes.io/projected/a8601fc9-0325-4c09-951c-afdda2eb2e05-kube-api-access-6jjss\") pod \"machine-config-controller-84d6567774-fb24z\" (UID: \"a8601fc9-0325-4c09-951c-afdda2eb2e05\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fb24z" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106537 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9c633c27-c00d-4436-8b95-c327bcf08a0c-audit-policies\") pod \"apiserver-7bbb656c7d-s76rx\" (UID: \"9c633c27-c00d-4436-8b95-c327bcf08a0c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106555 4687 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aad2142e-fd55-41f2-96ca-f43b0362c071-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-kz4nd\" (UID: \"aad2142e-fd55-41f2-96ca-f43b0362c071\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kz4nd" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106571 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/55b1fe7b-e164-4f79-835b-0cc128a680eb-stats-auth\") pod \"router-default-5444994796-zrtwj\" (UID: \"55b1fe7b-e164-4f79-835b-0cc128a680eb\") " pod="openshift-ingress/router-default-5444994796-zrtwj" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106606 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m67c\" (UniqueName: \"kubernetes.io/projected/3dbb3aa8-4352-49d1-b693-10281b8e4fac-kube-api-access-8m67c\") pod \"ingress-canary-47grc\" (UID: \"3dbb3aa8-4352-49d1-b693-10281b8e4fac\") " pod="openshift-ingress-canary/ingress-canary-47grc" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106642 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106672 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a8601fc9-0325-4c09-951c-afdda2eb2e05-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-fb24z\" (UID: \"a8601fc9-0325-4c09-951c-afdda2eb2e05\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fb24z" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106689 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-etcd-serving-ca\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106708 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ccqs\" (UniqueName: \"kubernetes.io/projected/75b60b2e-cda7-4a73-bf67-117363db768a-kube-api-access-9ccqs\") pod \"dns-default-kvzpk\" (UID: \"75b60b2e-cda7-4a73-bf67-117363db768a\") " pod="openshift-dns/dns-default-kvzpk" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106752 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29j69\" (UniqueName: \"kubernetes.io/projected/e899d87a-f034-4436-8409-ca04178918b7-kube-api-access-29j69\") pod \"collect-profiles-29537820-r5c29\" (UID: \"e899d87a-f034-4436-8409-ca04178918b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537820-r5c29" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106779 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36a32d28-84e1-4c44-b2e5-546c8a1c8853-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5xt25\" (UID: \"36a32d28-84e1-4c44-b2e5-546c8a1c8853\") " pod="openshift-marketplace/marketplace-operator-79b997595-5xt25" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106796 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/96e679f2-11c5-4ade-abc4-56a7b85a5668-console-oauth-config\") pod \"console-f9d7485db-4m8kh\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " pod="openshift-console/console-f9d7485db-4m8kh" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106814 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9292d86c-b9c1-4a63-a766-c25874ffa2f5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9thbt\" (UID: \"9292d86c-b9c1-4a63-a766-c25874ffa2f5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9thbt" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106831 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aad2142e-fd55-41f2-96ca-f43b0362c071-config\") pod \"openshift-apiserver-operator-796bbdcf4f-kz4nd\" (UID: \"aad2142e-fd55-41f2-96ca-f43b0362c071\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kz4nd" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106861 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-registry-tls\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106879 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/36a32d28-84e1-4c44-b2e5-546c8a1c8853-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5xt25\" (UID: \"36a32d28-84e1-4c44-b2e5-546c8a1c8853\") " pod="openshift-marketplace/marketplace-operator-79b997595-5xt25" Feb 28 09:05:08 crc 
kubenswrapper[4687]: I0228 09:05:08.106896 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/36c2be94-93ed-4fba-9bcd-e0ebe892909e-srv-cert\") pod \"olm-operator-6b444d44fb-9b6d5\" (UID: \"36c2be94-93ed-4fba-9bcd-e0ebe892909e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9b6d5" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106923 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6ec553c-b9e9-4c6e-a1d1-6d730702968f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-w7jmh\" (UID: \"a6ec553c-b9e9-4c6e-a1d1-6d730702968f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w7jmh" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106938 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9c633c27-c00d-4436-8b95-c327bcf08a0c-audit-dir\") pod \"apiserver-7bbb656c7d-s76rx\" (UID: \"9c633c27-c00d-4436-8b95-c327bcf08a0c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106981 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg7br\" (UniqueName: \"kubernetes.io/projected/f77b68ae-c1dd-481b-a831-d4698d8f44a0-kube-api-access-dg7br\") pod \"openshift-config-operator-7777fb866f-w4s2q\" (UID: \"f77b68ae-c1dd-481b-a831-d4698d8f44a0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-w4s2q" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.106998 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-serving-cert\") pod \"apiserver-76f77b778f-bqdqx\" (UID: 
\"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.107014 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96e679f2-11c5-4ade-abc4-56a7b85a5668-console-config\") pod \"console-f9d7485db-4m8kh\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " pod="openshift-console/console-f9d7485db-4m8kh" Feb 28 09:05:08 crc kubenswrapper[4687]: E0228 09:05:08.107076 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:08.607060379 +0000 UTC m=+100.297629716 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.107103 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/63d9a180-a0d1-474e-a850-9a4235c5ac62-metrics-tls\") pod \"ingress-operator-5b745b69d9-vx4hz\" (UID: \"63d9a180-a0d1-474e-a850-9a4235c5ac62\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vx4hz" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.107148 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-encryption-config\") pod 
\"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.107181 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-etcd-client\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.107251 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdlxb\" (UniqueName: \"kubernetes.io/projected/36a32d28-84e1-4c44-b2e5-546c8a1c8853-kube-api-access-pdlxb\") pod \"marketplace-operator-79b997595-5xt25\" (UID: \"36a32d28-84e1-4c44-b2e5-546c8a1c8853\") " pod="openshift-marketplace/marketplace-operator-79b997595-5xt25" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.107426 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnc48\" (UniqueName: \"kubernetes.io/projected/8a30ffde-a939-4553-9c76-62164e19d8c6-kube-api-access-jnc48\") pod \"multus-admission-controller-857f4d67dd-qns64\" (UID: \"8a30ffde-a939-4553-9c76-62164e19d8c6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qns64" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.107451 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsn8k\" (UniqueName: \"kubernetes.io/projected/a0596eca-6aad-4812-8c5c-06c0ab0ae911-kube-api-access-hsn8k\") pod \"package-server-manager-789f6589d5-mhzwl\" (UID: \"a0596eca-6aad-4812-8c5c-06c0ab0ae911\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mhzwl" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.107517 4687 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e899d87a-f034-4436-8409-ca04178918b7-secret-volume\") pod \"collect-profiles-29537820-r5c29\" (UID: \"e899d87a-f034-4436-8409-ca04178918b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537820-r5c29" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.107555 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-config\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.107596 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9c633c27-c00d-4436-8b95-c327bcf08a0c-encryption-config\") pod \"apiserver-7bbb656c7d-s76rx\" (UID: \"9c633c27-c00d-4436-8b95-c327bcf08a0c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.109218 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-registry-certificates\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.109244 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mstnh\" (UniqueName: \"kubernetes.io/projected/55b1fe7b-e164-4f79-835b-0cc128a680eb-kube-api-access-mstnh\") pod \"router-default-5444994796-zrtwj\" (UID: \"55b1fe7b-e164-4f79-835b-0cc128a680eb\") " pod="openshift-ingress/router-default-5444994796-zrtwj" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 
09:05:08.109369 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.109405 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll94z\" (UniqueName: \"kubernetes.io/projected/4beb3d39-9d4e-4964-9567-67396e456053-kube-api-access-ll94z\") pod \"service-ca-operator-777779d784-bk6v7\" (UID: \"4beb3d39-9d4e-4964-9567-67396e456053\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bk6v7" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.109530 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-974wt\" (UniqueName: \"kubernetes.io/projected/63d9a180-a0d1-474e-a850-9a4235c5ac62-kube-api-access-974wt\") pod \"ingress-operator-5b745b69d9-vx4hz\" (UID: \"63d9a180-a0d1-474e-a850-9a4235c5ac62\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vx4hz" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.110337 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.110696 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64p8l\" (UniqueName: \"kubernetes.io/projected/9292d86c-b9c1-4a63-a766-c25874ffa2f5-kube-api-access-64p8l\") pod 
\"machine-api-operator-5694c8668f-9thbt\" (UID: \"9292d86c-b9c1-4a63-a766-c25874ffa2f5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9thbt" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.110815 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09d34ddd-da09-46e8-a9d5-5f395dbe8625-audit-policies\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.111451 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8601fc9-0325-4c09-951c-afdda2eb2e05-proxy-tls\") pod \"machine-config-controller-84d6567774-fb24z\" (UID: \"a8601fc9-0325-4c09-951c-afdda2eb2e05\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fb24z" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.111480 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv2lz\" (UniqueName: \"kubernetes.io/projected/9c633c27-c00d-4436-8b95-c327bcf08a0c-kube-api-access-zv2lz\") pod \"apiserver-7bbb656c7d-s76rx\" (UID: \"9c633c27-c00d-4436-8b95-c327bcf08a0c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.111551 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/100b328c-d3fd-4a0f-82e5-428f29240fc4-signing-cabundle\") pod \"service-ca-9c57cc56f-q26sq\" (UID: \"100b328c-d3fd-4a0f-82e5-428f29240fc4\") " pod="openshift-service-ca/service-ca-9c57cc56f-q26sq" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.111570 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/96e679f2-11c5-4ade-abc4-56a7b85a5668-service-ca\") pod \"console-f9d7485db-4m8kh\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " pod="openshift-console/console-f9d7485db-4m8kh" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.111703 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6fj2\" (UniqueName: \"kubernetes.io/projected/a6ec553c-b9e9-4c6e-a1d1-6d730702968f-kube-api-access-m6fj2\") pod \"cluster-samples-operator-665b6dd947-w7jmh\" (UID: \"a6ec553c-b9e9-4c6e-a1d1-6d730702968f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w7jmh" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.111751 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwr7x\" (UniqueName: \"kubernetes.io/projected/4aa07587-0d38-4e29-92ef-c6957b5526a8-kube-api-access-bwr7x\") pod \"downloads-7954f5f757-8vhfl\" (UID: \"4aa07587-0d38-4e29-92ef-c6957b5526a8\") " pod="openshift-console/downloads-7954f5f757-8vhfl" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.111779 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/664a84bd-b59d-4f25-824f-12b593193cd2-registration-dir\") pod \"csi-hostpathplugin-26znf\" (UID: \"664a84bd-b59d-4f25-824f-12b593193cd2\") " pod="hostpath-provisioner/csi-hostpathplugin-26znf" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.111855 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-audit-dir\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.111874 4687 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-q28j9\" (UniqueName: \"kubernetes.io/projected/09d34ddd-da09-46e8-a9d5-5f395dbe8625-kube-api-access-q28j9\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.111894 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/664a84bd-b59d-4f25-824f-12b593193cd2-csi-data-dir\") pod \"csi-hostpathplugin-26znf\" (UID: \"664a84bd-b59d-4f25-824f-12b593193cd2\") " pod="hostpath-provisioner/csi-hostpathplugin-26znf" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.111925 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4beb3d39-9d4e-4964-9567-67396e456053-config\") pod \"service-ca-operator-777779d784-bk6v7\" (UID: \"4beb3d39-9d4e-4964-9567-67396e456053\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bk6v7" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.111944 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-image-import-ca\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.111961 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/09d34ddd-da09-46e8-a9d5-5f395dbe8625-audit-dir\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.111978 4687 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.111996 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ztpn\" (UniqueName: \"kubernetes.io/projected/aad2142e-fd55-41f2-96ca-f43b0362c071-kube-api-access-9ztpn\") pod \"openshift-apiserver-operator-796bbdcf4f-kz4nd\" (UID: \"aad2142e-fd55-41f2-96ca-f43b0362c071\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kz4nd" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.112747 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75b60b2e-cda7-4a73-bf67-117363db768a-config-volume\") pod \"dns-default-kvzpk\" (UID: \"75b60b2e-cda7-4a73-bf67-117363db768a\") " pod="openshift-dns/dns-default-kvzpk" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.112810 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-node-pullsecrets\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.112829 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-554cb\" (UniqueName: \"kubernetes.io/projected/100b328c-d3fd-4a0f-82e5-428f29240fc4-kube-api-access-554cb\") pod \"service-ca-9c57cc56f-q26sq\" (UID: 
\"100b328c-d3fd-4a0f-82e5-428f29240fc4\") " pod="openshift-service-ca/service-ca-9c57cc56f-q26sq" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.113042 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-etcd-serving-ca\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.113014 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c633c27-c00d-4436-8b95-c327bcf08a0c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-s76rx\" (UID: \"9c633c27-c00d-4436-8b95-c327bcf08a0c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.113399 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/55b1fe7b-e164-4f79-835b-0cc128a680eb-default-certificate\") pod \"router-default-5444994796-zrtwj\" (UID: \"55b1fe7b-e164-4f79-835b-0cc128a680eb\") " pod="openshift-ingress/router-default-5444994796-zrtwj" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.113447 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbwps\" (UniqueName: \"kubernetes.io/projected/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-kube-api-access-kbwps\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.113946 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.114313 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96e679f2-11c5-4ade-abc4-56a7b85a5668-console-config\") pod \"console-f9d7485db-4m8kh\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " pod="openshift-console/console-f9d7485db-4m8kh" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.114455 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pppwd\" (UniqueName: \"kubernetes.io/projected/96e679f2-11c5-4ade-abc4-56a7b85a5668-kube-api-access-pppwd\") pod \"console-f9d7485db-4m8kh\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " pod="openshift-console/console-f9d7485db-4m8kh" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.112629 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.115200 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.115307 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.115849 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4beb3d39-9d4e-4964-9567-67396e456053-serving-cert\") pod \"service-ca-operator-777779d784-bk6v7\" (UID: \"4beb3d39-9d4e-4964-9567-67396e456053\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bk6v7" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.116735 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/36c2be94-93ed-4fba-9bcd-e0ebe892909e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9b6d5\" (UID: \"36c2be94-93ed-4fba-9bcd-e0ebe892909e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9b6d5" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.117367 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.120868 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f77b68ae-c1dd-481b-a831-d4698d8f44a0-serving-cert\") pod \"openshift-config-operator-7777fb866f-w4s2q\" (UID: \"f77b68ae-c1dd-481b-a831-d4698d8f44a0\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-w4s2q" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.121112 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.121789 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-ca-trust-extracted\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.122176 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/100b328c-d3fd-4a0f-82e5-428f29240fc4-signing-key\") pod \"service-ca-9c57cc56f-q26sq\" (UID: \"100b328c-d3fd-4a0f-82e5-428f29240fc4\") " pod="openshift-service-ca/service-ca-9c57cc56f-q26sq" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.122320 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-rsgcf"] Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.125380 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f77b68ae-c1dd-481b-a831-d4698d8f44a0-available-featuregates\") pod \"openshift-config-operator-7777fb866f-w4s2q\" (UID: \"f77b68ae-c1dd-481b-a831-d4698d8f44a0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-w4s2q" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.125788 4687 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a8601fc9-0325-4c09-951c-afdda2eb2e05-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-fb24z\" (UID: \"a8601fc9-0325-4c09-951c-afdda2eb2e05\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fb24z" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.126943 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xptn"] Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.127946 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-registry-certificates\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.128090 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36a32d28-84e1-4c44-b2e5-546c8a1c8853-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5xt25\" (UID: \"36a32d28-84e1-4c44-b2e5-546c8a1c8853\") " pod="openshift-marketplace/marketplace-operator-79b997595-5xt25" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.128555 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dvw6x"] Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.128957 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-audit\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc 
kubenswrapper[4687]: I0228 09:05:08.129137 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-node-pullsecrets\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.129498 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-audit-dir\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.130102 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7461d892-4781-495c-b78f-5fe375ed4f44-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-494jw\" (UID: \"7461d892-4781-495c-b78f-5fe375ed4f44\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-494jw" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.130349 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/09d34ddd-da09-46e8-a9d5-5f395dbe8625-audit-dir\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.130965 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4beb3d39-9d4e-4964-9567-67396e456053-config\") pod \"service-ca-operator-777779d784-bk6v7\" (UID: \"4beb3d39-9d4e-4964-9567-67396e456053\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-bk6v7" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.131067 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/100b328c-d3fd-4a0f-82e5-428f29240fc4-signing-cabundle\") pod \"service-ca-9c57cc56f-q26sq\" (UID: \"100b328c-d3fd-4a0f-82e5-428f29240fc4\") " pod="openshift-service-ca/service-ca-9c57cc56f-q26sq" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.131901 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-image-import-ca\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.135306 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96e679f2-11c5-4ade-abc4-56a7b85a5668-service-ca\") pod \"console-f9d7485db-4m8kh\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " pod="openshift-console/console-f9d7485db-4m8kh" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.135853 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.136445 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e899d87a-f034-4436-8409-ca04178918b7-secret-volume\") pod \"collect-profiles-29537820-r5c29\" (UID: 
\"e899d87a-f034-4436-8409-ca04178918b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537820-r5c29" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.137085 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/63d9a180-a0d1-474e-a850-9a4235c5ac62-metrics-tls\") pod \"ingress-operator-5b745b69d9-vx4hz\" (UID: \"63d9a180-a0d1-474e-a850-9a4235c5ac62\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vx4hz" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.138921 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96e679f2-11c5-4ade-abc4-56a7b85a5668-console-serving-cert\") pod \"console-f9d7485db-4m8kh\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " pod="openshift-console/console-f9d7485db-4m8kh" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.141178 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.141703 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s2d57"] Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.142226 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-config\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.142379 4687 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.142582 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-registry-tls\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.142684 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-encryption-config\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.144328 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8a30ffde-a939-4553-9c76-62164e19d8c6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-qns64\" (UID: \"8a30ffde-a939-4553-9c76-62164e19d8c6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qns64" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.144375 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-etcd-client\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 
09:05:08.144720 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-installation-pull-secrets\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.145168 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8601fc9-0325-4c09-951c-afdda2eb2e05-proxy-tls\") pod \"machine-config-controller-84d6567774-fb24z\" (UID: \"a8601fc9-0325-4c09-951c-afdda2eb2e05\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fb24z" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.145230 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6ec553c-b9e9-4c6e-a1d1-6d730702968f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-w7jmh\" (UID: \"a6ec553c-b9e9-4c6e-a1d1-6d730702968f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w7jmh" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.146742 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6278\" (UniqueName: \"kubernetes.io/projected/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-kube-api-access-v6278\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.147207 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-serving-cert\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " 
pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.147669 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96e679f2-11c5-4ade-abc4-56a7b85a5668-console-oauth-config\") pod \"console-f9d7485db-4m8kh\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " pod="openshift-console/console-f9d7485db-4m8kh" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.148334 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/36a32d28-84e1-4c44-b2e5-546c8a1c8853-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5xt25\" (UID: \"36a32d28-84e1-4c44-b2e5-546c8a1c8853\") " pod="openshift-marketplace/marketplace-operator-79b997595-5xt25" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.161991 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63d9a180-a0d1-474e-a850-9a4235c5ac62-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vx4hz\" (UID: \"63d9a180-a0d1-474e-a850-9a4235c5ac62\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vx4hz" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.185411 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnc48\" (UniqueName: \"kubernetes.io/projected/8a30ffde-a939-4553-9c76-62164e19d8c6-kube-api-access-jnc48\") pod \"multus-admission-controller-857f4d67dd-qns64\" (UID: \"8a30ffde-a939-4553-9c76-62164e19d8c6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qns64" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.206009 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58h2p\" (UniqueName: \"kubernetes.io/projected/7461d892-4781-495c-b78f-5fe375ed4f44-kube-api-access-58h2p\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-494jw\" (UID: \"7461d892-4781-495c-b78f-5fe375ed4f44\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-494jw" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.221435 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.221738 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aad2142e-fd55-41f2-96ca-f43b0362c071-config\") pod \"openshift-apiserver-operator-796bbdcf4f-kz4nd\" (UID: \"aad2142e-fd55-41f2-96ca-f43b0362c071\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kz4nd" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.221784 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9292d86c-b9c1-4a63-a766-c25874ffa2f5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9thbt\" (UID: \"9292d86c-b9c1-4a63-a766-c25874ffa2f5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9thbt" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.221807 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/36c2be94-93ed-4fba-9bcd-e0ebe892909e-srv-cert\") pod \"olm-operator-6b444d44fb-9b6d5\" (UID: \"36c2be94-93ed-4fba-9bcd-e0ebe892909e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9b6d5" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.221825 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9c633c27-c00d-4436-8b95-c327bcf08a0c-audit-dir\") pod \"apiserver-7bbb656c7d-s76rx\" (UID: \"9c633c27-c00d-4436-8b95-c327bcf08a0c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.221865 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsn8k\" (UniqueName: \"kubernetes.io/projected/a0596eca-6aad-4812-8c5c-06c0ab0ae911-kube-api-access-hsn8k\") pod \"package-server-manager-789f6589d5-mhzwl\" (UID: \"a0596eca-6aad-4812-8c5c-06c0ab0ae911\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mhzwl" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.221889 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9c633c27-c00d-4436-8b95-c327bcf08a0c-encryption-config\") pod \"apiserver-7bbb656c7d-s76rx\" (UID: \"9c633c27-c00d-4436-8b95-c327bcf08a0c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.221906 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mstnh\" (UniqueName: \"kubernetes.io/projected/55b1fe7b-e164-4f79-835b-0cc128a680eb-kube-api-access-mstnh\") pod \"router-default-5444994796-zrtwj\" (UID: \"55b1fe7b-e164-4f79-835b-0cc128a680eb\") " pod="openshift-ingress/router-default-5444994796-zrtwj" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.221922 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64p8l\" (UniqueName: \"kubernetes.io/projected/9292d86c-b9c1-4a63-a766-c25874ffa2f5-kube-api-access-64p8l\") pod \"machine-api-operator-5694c8668f-9thbt\" (UID: \"9292d86c-b9c1-4a63-a766-c25874ffa2f5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9thbt" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 
09:05:08.221956 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv2lz\" (UniqueName: \"kubernetes.io/projected/9c633c27-c00d-4436-8b95-c327bcf08a0c-kube-api-access-zv2lz\") pod \"apiserver-7bbb656c7d-s76rx\" (UID: \"9c633c27-c00d-4436-8b95-c327bcf08a0c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.221986 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/664a84bd-b59d-4f25-824f-12b593193cd2-registration-dir\") pod \"csi-hostpathplugin-26znf\" (UID: \"664a84bd-b59d-4f25-824f-12b593193cd2\") " pod="hostpath-provisioner/csi-hostpathplugin-26znf" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.222010 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/664a84bd-b59d-4f25-824f-12b593193cd2-csi-data-dir\") pod \"csi-hostpathplugin-26znf\" (UID: \"664a84bd-b59d-4f25-824f-12b593193cd2\") " pod="hostpath-provisioner/csi-hostpathplugin-26znf" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.222040 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ztpn\" (UniqueName: \"kubernetes.io/projected/aad2142e-fd55-41f2-96ca-f43b0362c071-kube-api-access-9ztpn\") pod \"openshift-apiserver-operator-796bbdcf4f-kz4nd\" (UID: \"aad2142e-fd55-41f2-96ca-f43b0362c071\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kz4nd" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.222058 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75b60b2e-cda7-4a73-bf67-117363db768a-config-volume\") pod \"dns-default-kvzpk\" (UID: \"75b60b2e-cda7-4a73-bf67-117363db768a\") " pod="openshift-dns/dns-default-kvzpk" Feb 28 09:05:08 crc 
kubenswrapper[4687]: I0228 09:05:08.222080 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c633c27-c00d-4436-8b95-c327bcf08a0c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-s76rx\" (UID: \"9c633c27-c00d-4436-8b95-c327bcf08a0c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.222096 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/55b1fe7b-e164-4f79-835b-0cc128a680eb-default-certificate\") pod \"router-default-5444994796-zrtwj\" (UID: \"55b1fe7b-e164-4f79-835b-0cc128a680eb\") " pod="openshift-ingress/router-default-5444994796-zrtwj" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.222112 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/36c2be94-93ed-4fba-9bcd-e0ebe892909e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9b6d5\" (UID: \"36c2be94-93ed-4fba-9bcd-e0ebe892909e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9b6d5" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.222145 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/55b1fe7b-e164-4f79-835b-0cc128a680eb-metrics-certs\") pod \"router-default-5444994796-zrtwj\" (UID: \"55b1fe7b-e164-4f79-835b-0cc128a680eb\") " pod="openshift-ingress/router-default-5444994796-zrtwj" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.222162 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/664a84bd-b59d-4f25-824f-12b593193cd2-mountpoint-dir\") pod \"csi-hostpathplugin-26znf\" (UID: \"664a84bd-b59d-4f25-824f-12b593193cd2\") " 
pod="hostpath-provisioner/csi-hostpathplugin-26znf" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.222178 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgbbw\" (UniqueName: \"kubernetes.io/projected/664a84bd-b59d-4f25-824f-12b593193cd2-kube-api-access-vgbbw\") pod \"csi-hostpathplugin-26znf\" (UID: \"664a84bd-b59d-4f25-824f-12b593193cd2\") " pod="hostpath-provisioner/csi-hostpathplugin-26znf" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.222203 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a0596eca-6aad-4812-8c5c-06c0ab0ae911-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mhzwl\" (UID: \"a0596eca-6aad-4812-8c5c-06c0ab0ae911\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mhzwl" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.222226 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q9fd\" (UniqueName: \"kubernetes.io/projected/e00474ba-3061-4d3c-8880-05e4d50d82ae-kube-api-access-6q9fd\") pod \"migrator-59844c95c7-hhlxz\" (UID: \"e00474ba-3061-4d3c-8880-05e4d50d82ae\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hhlxz" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.222244 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9c633c27-c00d-4436-8b95-c327bcf08a0c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-s76rx\" (UID: \"9c633c27-c00d-4436-8b95-c327bcf08a0c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.222260 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/75b60b2e-cda7-4a73-bf67-117363db768a-metrics-tls\") pod \"dns-default-kvzpk\" (UID: \"75b60b2e-cda7-4a73-bf67-117363db768a\") " pod="openshift-dns/dns-default-kvzpk" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.222277 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9292d86c-b9c1-4a63-a766-c25874ffa2f5-images\") pod \"machine-api-operator-5694c8668f-9thbt\" (UID: \"9292d86c-b9c1-4a63-a766-c25874ffa2f5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9thbt" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.222301 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55b1fe7b-e164-4f79-835b-0cc128a680eb-service-ca-bundle\") pod \"router-default-5444994796-zrtwj\" (UID: \"55b1fe7b-e164-4f79-835b-0cc128a680eb\") " pod="openshift-ingress/router-default-5444994796-zrtwj" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.222318 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chwq9\" (UniqueName: \"kubernetes.io/projected/36c2be94-93ed-4fba-9bcd-e0ebe892909e-kube-api-access-chwq9\") pod \"olm-operator-6b444d44fb-9b6d5\" (UID: \"36c2be94-93ed-4fba-9bcd-e0ebe892909e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9b6d5" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.222331 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/664a84bd-b59d-4f25-824f-12b593193cd2-plugins-dir\") pod \"csi-hostpathplugin-26znf\" (UID: \"664a84bd-b59d-4f25-824f-12b593193cd2\") " pod="hostpath-provisioner/csi-hostpathplugin-26znf" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.222359 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9c633c27-c00d-4436-8b95-c327bcf08a0c-serving-cert\") pod \"apiserver-7bbb656c7d-s76rx\" (UID: \"9c633c27-c00d-4436-8b95-c327bcf08a0c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.222393 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dbb3aa8-4352-49d1-b693-10281b8e4fac-cert\") pod \"ingress-canary-47grc\" (UID: \"3dbb3aa8-4352-49d1-b693-10281b8e4fac\") " pod="openshift-ingress-canary/ingress-canary-47grc" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.222410 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/664a84bd-b59d-4f25-824f-12b593193cd2-socket-dir\") pod \"csi-hostpathplugin-26znf\" (UID: \"664a84bd-b59d-4f25-824f-12b593193cd2\") " pod="hostpath-provisioner/csi-hostpathplugin-26znf" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.222424 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9c633c27-c00d-4436-8b95-c327bcf08a0c-etcd-client\") pod \"apiserver-7bbb656c7d-s76rx\" (UID: \"9c633c27-c00d-4436-8b95-c327bcf08a0c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.222446 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9c633c27-c00d-4436-8b95-c327bcf08a0c-audit-policies\") pod \"apiserver-7bbb656c7d-s76rx\" (UID: \"9c633c27-c00d-4436-8b95-c327bcf08a0c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.222460 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/aad2142e-fd55-41f2-96ca-f43b0362c071-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-kz4nd\" (UID: \"aad2142e-fd55-41f2-96ca-f43b0362c071\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kz4nd" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.222476 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/55b1fe7b-e164-4f79-835b-0cc128a680eb-stats-auth\") pod \"router-default-5444994796-zrtwj\" (UID: \"55b1fe7b-e164-4f79-835b-0cc128a680eb\") " pod="openshift-ingress/router-default-5444994796-zrtwj" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.222493 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m67c\" (UniqueName: \"kubernetes.io/projected/3dbb3aa8-4352-49d1-b693-10281b8e4fac-kube-api-access-8m67c\") pod \"ingress-canary-47grc\" (UID: \"3dbb3aa8-4352-49d1-b693-10281b8e4fac\") " pod="openshift-ingress-canary/ingress-canary-47grc" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.222520 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ccqs\" (UniqueName: \"kubernetes.io/projected/75b60b2e-cda7-4a73-bf67-117363db768a-kube-api-access-9ccqs\") pod \"dns-default-kvzpk\" (UID: \"75b60b2e-cda7-4a73-bf67-117363db768a\") " pod="openshift-dns/dns-default-kvzpk" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.222999 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7tgnq"] Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.223086 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/664a84bd-b59d-4f25-824f-12b593193cd2-mountpoint-dir\") pod \"csi-hostpathplugin-26znf\" (UID: \"664a84bd-b59d-4f25-824f-12b593193cd2\") 
" pod="hostpath-provisioner/csi-hostpathplugin-26znf" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.225964 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jjss\" (UniqueName: \"kubernetes.io/projected/a8601fc9-0325-4c09-951c-afdda2eb2e05-kube-api-access-6jjss\") pod \"machine-config-controller-84d6567774-fb24z\" (UID: \"a8601fc9-0325-4c09-951c-afdda2eb2e05\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fb24z" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.223590 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aad2142e-fd55-41f2-96ca-f43b0362c071-config\") pod \"openshift-apiserver-operator-796bbdcf4f-kz4nd\" (UID: \"aad2142e-fd55-41f2-96ca-f43b0362c071\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kz4nd" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.226201 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9c633c27-c00d-4436-8b95-c327bcf08a0c-audit-policies\") pod \"apiserver-7bbb656c7d-s76rx\" (UID: \"9c633c27-c00d-4436-8b95-c327bcf08a0c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.227014 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/664a84bd-b59d-4f25-824f-12b593193cd2-socket-dir\") pod \"csi-hostpathplugin-26znf\" (UID: \"664a84bd-b59d-4f25-824f-12b593193cd2\") " pod="hostpath-provisioner/csi-hostpathplugin-26znf" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.227169 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9c633c27-c00d-4436-8b95-c327bcf08a0c-audit-dir\") pod \"apiserver-7bbb656c7d-s76rx\" (UID: 
\"9c633c27-c00d-4436-8b95-c327bcf08a0c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.223202 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/664a84bd-b59d-4f25-824f-12b593193cd2-plugins-dir\") pod \"csi-hostpathplugin-26znf\" (UID: \"664a84bd-b59d-4f25-824f-12b593193cd2\") " pod="hostpath-provisioner/csi-hostpathplugin-26znf" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.227707 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/664a84bd-b59d-4f25-824f-12b593193cd2-csi-data-dir\") pod \"csi-hostpathplugin-26znf\" (UID: \"664a84bd-b59d-4f25-824f-12b593193cd2\") " pod="hostpath-provisioner/csi-hostpathplugin-26znf" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.227708 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/664a84bd-b59d-4f25-824f-12b593193cd2-registration-dir\") pod \"csi-hostpathplugin-26znf\" (UID: \"664a84bd-b59d-4f25-824f-12b593193cd2\") " pod="hostpath-provisioner/csi-hostpathplugin-26znf" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.228189 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9292d86c-b9c1-4a63-a766-c25874ffa2f5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9thbt\" (UID: \"9292d86c-b9c1-4a63-a766-c25874ffa2f5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9thbt" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.228384 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55b1fe7b-e164-4f79-835b-0cc128a680eb-service-ca-bundle\") pod \"router-default-5444994796-zrtwj\" (UID: 
\"55b1fe7b-e164-4f79-835b-0cc128a680eb\") " pod="openshift-ingress/router-default-5444994796-zrtwj" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.228963 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9c633c27-c00d-4436-8b95-c327bcf08a0c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-s76rx\" (UID: \"9c633c27-c00d-4436-8b95-c327bcf08a0c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.229909 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c633c27-c00d-4436-8b95-c327bcf08a0c-serving-cert\") pod \"apiserver-7bbb656c7d-s76rx\" (UID: \"9c633c27-c00d-4436-8b95-c327bcf08a0c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.230377 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a0596eca-6aad-4812-8c5c-06c0ab0ae911-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mhzwl\" (UID: \"a0596eca-6aad-4812-8c5c-06c0ab0ae911\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mhzwl" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.230971 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c633c27-c00d-4436-8b95-c327bcf08a0c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-s76rx\" (UID: \"9c633c27-c00d-4436-8b95-c327bcf08a0c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:08 crc kubenswrapper[4687]: E0228 09:05:08.231159 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:08.722693288 +0000 UTC m=+100.413262626 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.231935 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9292d86c-b9c1-4a63-a766-c25874ffa2f5-images\") pod \"machine-api-operator-5694c8668f-9thbt\" (UID: \"9292d86c-b9c1-4a63-a766-c25874ffa2f5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9thbt" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.234803 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75b60b2e-cda7-4a73-bf67-117363db768a-config-volume\") pod \"dns-default-kvzpk\" (UID: \"75b60b2e-cda7-4a73-bf67-117363db768a\") " pod="openshift-dns/dns-default-kvzpk" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.236501 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/55b1fe7b-e164-4f79-835b-0cc128a680eb-default-certificate\") pod \"router-default-5444994796-zrtwj\" (UID: \"55b1fe7b-e164-4f79-835b-0cc128a680eb\") " pod="openshift-ingress/router-default-5444994796-zrtwj" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.237065 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/75b60b2e-cda7-4a73-bf67-117363db768a-metrics-tls\") pod \"dns-default-kvzpk\" (UID: \"75b60b2e-cda7-4a73-bf67-117363db768a\") " pod="openshift-dns/dns-default-kvzpk" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.237184 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/36c2be94-93ed-4fba-9bcd-e0ebe892909e-srv-cert\") pod \"olm-operator-6b444d44fb-9b6d5\" (UID: \"36c2be94-93ed-4fba-9bcd-e0ebe892909e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9b6d5" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.238228 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aad2142e-fd55-41f2-96ca-f43b0362c071-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-kz4nd\" (UID: \"aad2142e-fd55-41f2-96ca-f43b0362c071\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kz4nd" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.238337 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fb24z" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.239141 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/36c2be94-93ed-4fba-9bcd-e0ebe892909e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9b6d5\" (UID: \"36c2be94-93ed-4fba-9bcd-e0ebe892909e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9b6d5" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.239550 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9c633c27-c00d-4436-8b95-c327bcf08a0c-encryption-config\") pod \"apiserver-7bbb656c7d-s76rx\" (UID: \"9c633c27-c00d-4436-8b95-c327bcf08a0c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.239976 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9c633c27-c00d-4436-8b95-c327bcf08a0c-etcd-client\") pod \"apiserver-7bbb656c7d-s76rx\" (UID: \"9c633c27-c00d-4436-8b95-c327bcf08a0c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.241878 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dbb3aa8-4352-49d1-b693-10281b8e4fac-cert\") pod \"ingress-canary-47grc\" (UID: \"3dbb3aa8-4352-49d1-b693-10281b8e4fac\") " pod="openshift-ingress-canary/ingress-canary-47grc" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.242363 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/55b1fe7b-e164-4f79-835b-0cc128a680eb-metrics-certs\") pod \"router-default-5444994796-zrtwj\" (UID: \"55b1fe7b-e164-4f79-835b-0cc128a680eb\") 
" pod="openshift-ingress/router-default-5444994796-zrtwj" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.243006 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/55b1fe7b-e164-4f79-835b-0cc128a680eb-stats-auth\") pod \"router-default-5444994796-zrtwj\" (UID: \"55b1fe7b-e164-4f79-835b-0cc128a680eb\") " pod="openshift-ingress/router-default-5444994796-zrtwj" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.244726 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-494jw" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.259830 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbwps\" (UniqueName: \"kubernetes.io/projected/0654f4f1-e605-4c0a-9e24-90b1ce4fd440-kube-api-access-kbwps\") pod \"apiserver-76f77b778f-bqdqx\" (UID: \"0654f4f1-e605-4c0a-9e24-90b1ce4fd440\") " pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.296304 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pppwd\" (UniqueName: \"kubernetes.io/projected/96e679f2-11c5-4ade-abc4-56a7b85a5668-kube-api-access-pppwd\") pod \"console-f9d7485db-4m8kh\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " pod="openshift-console/console-f9d7485db-4m8kh" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.300526 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-qns64" Feb 28 09:05:08 crc kubenswrapper[4687]: W0228 09:05:08.314615 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3d794dc_474f_4572_8227_60bc4a41c69e.slice/crio-ae5c0c18ab03cf122ee8d5a4929678eab62bf3db61e58fb9d60384e99d80a6fd WatchSource:0}: Error finding container ae5c0c18ab03cf122ee8d5a4929678eab62bf3db61e58fb9d60384e99d80a6fd: Status 404 returned error can't find the container with id ae5c0c18ab03cf122ee8d5a4929678eab62bf3db61e58fb9d60384e99d80a6fd Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.314971 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-bound-sa-token\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.324695 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:08 crc kubenswrapper[4687]: E0228 09:05:08.325214 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:08.825195711 +0000 UTC m=+100.515765049 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.326690 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdlxb\" (UniqueName: \"kubernetes.io/projected/36a32d28-84e1-4c44-b2e5-546c8a1c8853-kube-api-access-pdlxb\") pod \"marketplace-operator-79b997595-5xt25\" (UID: \"36a32d28-84e1-4c44-b2e5-546c8a1c8853\") " pod="openshift-marketplace/marketplace-operator-79b997595-5xt25" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.339450 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg7br\" (UniqueName: \"kubernetes.io/projected/f77b68ae-c1dd-481b-a831-d4698d8f44a0-kube-api-access-dg7br\") pod \"openshift-config-operator-7777fb866f-w4s2q\" (UID: \"f77b68ae-c1dd-481b-a831-d4698d8f44a0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-w4s2q" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.357627 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29j69\" (UniqueName: \"kubernetes.io/projected/e899d87a-f034-4436-8409-ca04178918b7-kube-api-access-29j69\") pod \"collect-profiles-29537820-r5c29\" (UID: \"e899d87a-f034-4436-8409-ca04178918b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537820-r5c29" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.366136 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.372262 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-w4s2q" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.404627 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-554cb\" (UniqueName: \"kubernetes.io/projected/100b328c-d3fd-4a0f-82e5-428f29240fc4-kube-api-access-554cb\") pod \"service-ca-9c57cc56f-q26sq\" (UID: \"100b328c-d3fd-4a0f-82e5-428f29240fc4\") " pod="openshift-service-ca/service-ca-9c57cc56f-q26sq" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.405427 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4m8kh" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.409328 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll94z\" (UniqueName: \"kubernetes.io/projected/4beb3d39-9d4e-4964-9567-67396e456053-kube-api-access-ll94z\") pod \"service-ca-operator-777779d784-bk6v7\" (UID: \"4beb3d39-9d4e-4964-9567-67396e456053\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bk6v7" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.417520 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-974wt\" (UniqueName: \"kubernetes.io/projected/63d9a180-a0d1-474e-a850-9a4235c5ac62-kube-api-access-974wt\") pod \"ingress-operator-5b745b69d9-vx4hz\" (UID: \"63d9a180-a0d1-474e-a850-9a4235c5ac62\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vx4hz" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.425352 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:08 crc kubenswrapper[4687]: E0228 09:05:08.425664 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:08.925652447 +0000 UTC m=+100.616221784 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.455541 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q28j9\" (UniqueName: \"kubernetes.io/projected/09d34ddd-da09-46e8-a9d5-5f395dbe8625-kube-api-access-q28j9\") pod \"oauth-openshift-558db77b4-zhdhr\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.466914 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.478366 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6fj2\" (UniqueName: \"kubernetes.io/projected/a6ec553c-b9e9-4c6e-a1d1-6d730702968f-kube-api-access-m6fj2\") pod \"cluster-samples-operator-665b6dd947-w7jmh\" (UID: \"a6ec553c-b9e9-4c6e-a1d1-6d730702968f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w7jmh" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.489124 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vx4hz" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.500469 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwr7x\" (UniqueName: \"kubernetes.io/projected/4aa07587-0d38-4e29-92ef-c6957b5526a8-kube-api-access-bwr7x\") pod \"downloads-7954f5f757-8vhfl\" (UID: \"4aa07587-0d38-4e29-92ef-c6957b5526a8\") " pod="openshift-console/downloads-7954f5f757-8vhfl" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.508181 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgbbw\" (UniqueName: \"kubernetes.io/projected/664a84bd-b59d-4f25-824f-12b593193cd2-kube-api-access-vgbbw\") pod \"csi-hostpathplugin-26znf\" (UID: \"664a84bd-b59d-4f25-824f-12b593193cd2\") " pod="hostpath-provisioner/csi-hostpathplugin-26znf" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.528101 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 
09:05:08 crc kubenswrapper[4687]: E0228 09:05:08.528423 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:09.028411544 +0000 UTC m=+100.718980880 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.554406 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m67c\" (UniqueName: \"kubernetes.io/projected/3dbb3aa8-4352-49d1-b693-10281b8e4fac-kube-api-access-8m67c\") pod \"ingress-canary-47grc\" (UID: \"3dbb3aa8-4352-49d1-b693-10281b8e4fac\") " pod="openshift-ingress-canary/ingress-canary-47grc" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.570532 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ccqs\" (UniqueName: \"kubernetes.io/projected/75b60b2e-cda7-4a73-bf67-117363db768a-kube-api-access-9ccqs\") pod \"dns-default-kvzpk\" (UID: \"75b60b2e-cda7-4a73-bf67-117363db768a\") " pod="openshift-dns/dns-default-kvzpk" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.605081 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsn8k\" (UniqueName: \"kubernetes.io/projected/a0596eca-6aad-4812-8c5c-06c0ab0ae911-kube-api-access-hsn8k\") pod \"package-server-manager-789f6589d5-mhzwl\" (UID: \"a0596eca-6aad-4812-8c5c-06c0ab0ae911\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mhzwl" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.628001 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5xt25" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.628532 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.628760 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ztpn\" (UniqueName: \"kubernetes.io/projected/aad2142e-fd55-41f2-96ca-f43b0362c071-kube-api-access-9ztpn\") pod \"openshift-apiserver-operator-796bbdcf4f-kz4nd\" (UID: \"aad2142e-fd55-41f2-96ca-f43b0362c071\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kz4nd" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.628871 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-q26sq" Feb 28 09:05:08 crc kubenswrapper[4687]: E0228 09:05:08.628953 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:09.128927421 +0000 UTC m=+100.819496758 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.629195 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:08 crc kubenswrapper[4687]: E0228 09:05:08.629602 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:09.129593444 +0000 UTC m=+100.820162781 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.635749 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537820-r5c29" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.637896 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q9fd\" (UniqueName: \"kubernetes.io/projected/e00474ba-3061-4d3c-8880-05e4d50d82ae-kube-api-access-6q9fd\") pod \"migrator-59844c95c7-hhlxz\" (UID: \"e00474ba-3061-4d3c-8880-05e4d50d82ae\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hhlxz" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.642158 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bk6v7" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.657823 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mstnh\" (UniqueName: \"kubernetes.io/projected/55b1fe7b-e164-4f79-835b-0cc128a680eb-kube-api-access-mstnh\") pod \"router-default-5444994796-zrtwj\" (UID: \"55b1fe7b-e164-4f79-835b-0cc128a680eb\") " pod="openshift-ingress/router-default-5444994796-zrtwj" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.660923 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64p8l\" (UniqueName: \"kubernetes.io/projected/9292d86c-b9c1-4a63-a766-c25874ffa2f5-kube-api-access-64p8l\") pod \"machine-api-operator-5694c8668f-9thbt\" (UID: \"9292d86c-b9c1-4a63-a766-c25874ffa2f5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9thbt" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.686552 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-26znf" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.689071 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv2lz\" (UniqueName: \"kubernetes.io/projected/9c633c27-c00d-4436-8b95-c327bcf08a0c-kube-api-access-zv2lz\") pod \"apiserver-7bbb656c7d-s76rx\" (UID: \"9c633c27-c00d-4436-8b95-c327bcf08a0c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.691484 4687 ???:1] "http: TLS handshake error from 192.168.126.11:39382: no serving certificate available for the kubelet" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.697570 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-47grc" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.701972 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kvzpk" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.709834 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.711277 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chwq9\" (UniqueName: \"kubernetes.io/projected/36c2be94-93ed-4fba-9bcd-e0ebe892909e-kube-api-access-chwq9\") pod \"olm-operator-6b444d44fb-9b6d5\" (UID: \"36c2be94-93ed-4fba-9bcd-e0ebe892909e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9b6d5" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.715939 4687 ???:1] "http: TLS handshake error from 192.168.126.11:39390: no serving certificate available for the kubelet" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.731144 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:08 crc kubenswrapper[4687]: E0228 09:05:08.731407 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:09.231389048 +0000 UTC m=+100.921958385 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.731905 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:08 crc kubenswrapper[4687]: E0228 09:05:08.732486 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:09.232472887 +0000 UTC m=+100.923042224 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.732987 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kz4nd" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.739001 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w7jmh" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.740741 4687 ???:1] "http: TLS handshake error from 192.168.126.11:39404: no serving certificate available for the kubelet" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.744958 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-8vhfl" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.772616 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-9thbt" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.798926 4687 ???:1] "http: TLS handshake error from 192.168.126.11:39416: no serving certificate available for the kubelet" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.804394 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-zrtwj" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.816371 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9b6d5" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.826809 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mhzwl" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.832864 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:08 crc kubenswrapper[4687]: E0228 09:05:08.833410 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:09.333381934 +0000 UTC m=+101.023951270 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.833511 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:08 crc kubenswrapper[4687]: E0228 09:05:08.833955 4687 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:09.333947587 +0000 UTC m=+101.024516924 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.908075 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hhlxz" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.921730 4687 ???:1] "http: TLS handshake error from 192.168.126.11:39428: no serving certificate available for the kubelet" Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.935194 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:08 crc kubenswrapper[4687]: E0228 09:05:08.935751 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:09.43573196 +0000 UTC m=+101.126301297 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:08 crc kubenswrapper[4687]: I0228 09:05:08.994402 4687 ???:1] "http: TLS handshake error from 192.168.126.11:39434: no serving certificate available for the kubelet" Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.037207 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:09 crc kubenswrapper[4687]: E0228 09:05:09.038956 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:09.538940654 +0000 UTC m=+101.229509991 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.101537 4687 ???:1] "http: TLS handshake error from 192.168.126.11:39448: no serving certificate available for the kubelet" Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.111521 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n" event={"ID":"eaa6a825-72b4-4544-9e19-5af6b2c7648e","Type":"ContainerStarted","Data":"6e973c19e8826a1f009fac80fc4f07882a7ae803718832a7495e81b398a11e0b"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.111570 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n" event={"ID":"eaa6a825-72b4-4544-9e19-5af6b2c7648e","Type":"ContainerStarted","Data":"eec4d7ce43bbec53e53a04ee55088542da07736ba4409ac570aeeb07e55d9886"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.112087 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n" Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.115120 4687 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-tx86n container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.115152 4687 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n" podUID="eaa6a825-72b4-4544-9e19-5af6b2c7648e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.116014 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s2d57" event={"ID":"941eced7-a875-42d3-91a4-d36f770b30a6","Type":"ContainerStarted","Data":"fa24e3eafcdb7b38b476d73aa361bacdfd976dd9a7faedfc90254afae171250d"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.116051 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s2d57" event={"ID":"941eced7-a875-42d3-91a4-d36f770b30a6","Type":"ContainerStarted","Data":"2e98d44a4b91f13ee24d2df520da7a9bcb1f2252813839c14a481ba03ea0acee"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.140319 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:09 crc kubenswrapper[4687]: E0228 09:05:09.140665 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:09.640648031 +0000 UTC m=+101.331217368 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.140966 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-shn8j" event={"ID":"eb51fd3a-6467-4fcd-b3c0-6e8efa30aa2b","Type":"ContainerStarted","Data":"3b0cb3c8fb31fcc15c428ef0d0801e44f03e0e10aeb129f11522fd29edf3871b"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.141002 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-shn8j" event={"ID":"eb51fd3a-6467-4fcd-b3c0-6e8efa30aa2b","Type":"ContainerStarted","Data":"a6ec41d243cccf9901467b8716d4664cf8d3dd41c4a4257ae8f4d609722e7f5e"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.158279 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-zrtwj" event={"ID":"55b1fe7b-e164-4f79-835b-0cc128a680eb","Type":"ContainerStarted","Data":"44571fc72e1104aca85a5c2bb7f1d117f9a4dc3735a15b5a4e8aaed89fe32f30"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.178685 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gqppb" event={"ID":"a47bc793-deb1-42d1-9759-42c79b7ef053","Type":"ContainerStarted","Data":"9aafb976db11654466bfd56280bcad11dd7f7db012e13dc83c4578add1cfbc63"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.178736 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gqppb" event={"ID":"a47bc793-deb1-42d1-9759-42c79b7ef053","Type":"ContainerStarted","Data":"e4742e51e31bfbba3070c8c3337b1c19dcecc597c1b7d779032c2b3da0827422"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.193298 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5grmr" event={"ID":"01c22693-ad9c-426f-8ae6-ac335c7cbca1","Type":"ContainerStarted","Data":"8cddfaf49c20627c33905fd084f5b9cc8cfd4baf5236d6c300429f9f7ea0a79a"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.193365 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5grmr" event={"ID":"01c22693-ad9c-426f-8ae6-ac335c7cbca1","Type":"ContainerStarted","Data":"618eb5661f9627988dc406478640976c3f1c3714e72f109055d805a5c7d43272"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.193378 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5grmr" event={"ID":"01c22693-ad9c-426f-8ae6-ac335c7cbca1","Type":"ContainerStarted","Data":"71c2c8b1fce44d2b1d0c3f6b89de57ed3db92aacf17eb4b8219e7a5eca9e980d"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.209386 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-dvw6x" event={"ID":"317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0","Type":"ContainerStarted","Data":"78d443535e66816d33c4786944566ffbd478467f0ee3848c0c270131a7a71cd5"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.209418 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-dvw6x" event={"ID":"317c37b0-9eb9-40aa-b0f3-bae9d4cc4ca0","Type":"ContainerStarted","Data":"efb7137eb5fdeb221e6752818dcfd3a90b857f406e05eaa336046dc996d9d5d4"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.228171 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4j8df" event={"ID":"5aacd998-f8bd-49e1-8d54-a4775c7e1f83","Type":"ContainerStarted","Data":"051e80f3de3804c4e60f51e63e2ac926c5d5e4813452c24fe60ed98890fb659f"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.228214 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4j8df" event={"ID":"5aacd998-f8bd-49e1-8d54-a4775c7e1f83","Type":"ContainerStarted","Data":"191c3364a76246a3221a4c16df878f745e3797aeb0cabe1294aa5487a1e9fed5"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.228225 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4j8df" event={"ID":"5aacd998-f8bd-49e1-8d54-a4775c7e1f83","Type":"ContainerStarted","Data":"e5e4a3c1c934315ce73a7d5afefab4e47bf387dafd0575bcba8acc742bb6f8d0"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.242142 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:09 crc kubenswrapper[4687]: E0228 09:05:09.243541 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:09.74353003 +0000 UTC m=+101.434099367 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.243591 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-stsx5" event={"ID":"defaa3ff-8d11-49b1-a9d4-7f54a0650d0a","Type":"ContainerStarted","Data":"021b1dd1c67038bfa217c76126d20b9be4d320793bcd8790a2526be500ba8b74"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.256446 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-rsgcf" event={"ID":"083e169f-d8ba-4454-a2e4-84587ae7551c","Type":"ContainerStarted","Data":"27eb364f5b9235adf702831a2d05de5e27fbc50c2a581d625541718b04daecc3"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.256482 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-rsgcf" event={"ID":"083e169f-d8ba-4454-a2e4-84587ae7551c","Type":"ContainerStarted","Data":"56b2f12ea66c15a3aeebc95185c81ceefbea3e90353b191804cc74c081348825"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.264682 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9g6h" event={"ID":"4bfbdc6f-2078-4dee-b253-d7658f4e839d","Type":"ContainerStarted","Data":"4f598c4ce7555d8fb6b2d44cb1014a8ddd9da948fe6646e48cb11a1fbcc16693"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.264896 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9g6h" event={"ID":"4bfbdc6f-2078-4dee-b253-d7658f4e839d","Type":"ContainerStarted","Data":"7b85df916c56a6b34b8e8a86b46dd93f568ab4e92d4cea556f2d77165ce622ec"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.264912 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-fb24z"] Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.265625 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9g6h" Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.274732 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xptn" event={"ID":"80136d7f-e0f8-4ff5-a22a-b5933f9e2cf0","Type":"ContainerStarted","Data":"b4d28f9add11ef2c0c95899f8ef57532d8d1866eac074f79fa05a4fbca346be7"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.274815 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xptn" event={"ID":"80136d7f-e0f8-4ff5-a22a-b5933f9e2cf0","Type":"ContainerStarted","Data":"6f6c0ba67d1ec4f612d302ca05e1ce8308b8f0a77b2e26702014a03e6dbac96a"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.278055 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xptn" Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.287911 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7tgnq" event={"ID":"c3d794dc-474f-4572-8227-60bc4a41c69e","Type":"ContainerStarted","Data":"93cfa4dfa7eefd1ec47d4baa7242a4f8bcc63a41203815f13a1eb81b2e1b71c4"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.287940 4687 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7tgnq" event={"ID":"c3d794dc-474f-4572-8227-60bc4a41c69e","Type":"ContainerStarted","Data":"ae5c0c18ab03cf122ee8d5a4929678eab62bf3db61e58fb9d60384e99d80a6fd"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.324419 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p4ft4" event={"ID":"9c6840e9-1b32-4e33-aa8c-31285246df48","Type":"ContainerStarted","Data":"32cb3a533130708b1323ea28729a4c9927618441c66d7ac48ac6b18c07d26dbd"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.339291 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd" event={"ID":"3b45242a-b238-4814-b6fa-f22a62c5907f","Type":"ContainerStarted","Data":"e6b6d91b683c4824d9fc5ef34d2ab0bd79f749327d9140168cea7b673cb637a6"} Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.343748 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:09 crc kubenswrapper[4687]: E0228 09:05:09.345252 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:09.84523326 +0000 UTC m=+101.535802597 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.346802 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xptn" Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.353417 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-494jw"] Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.361124 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-qns64"] Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.402613 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bqdqx"] Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.448618 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:09 crc kubenswrapper[4687]: E0228 09:05:09.455885 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-28 09:05:09.955872623 +0000 UTC m=+101.646441961 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.459266 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-6df9f" Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.460340 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw" Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.510408 4687 ???:1] "http: TLS handshake error from 192.168.126.11:39462: no serving certificate available for the kubelet" Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.514179 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9g6h" Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.550355 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:09 crc kubenswrapper[4687]: E0228 09:05:09.550502 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:10.050475956 +0000 UTC m=+101.741045292 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.550629 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:09 crc kubenswrapper[4687]: E0228 09:05:09.550908 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:10.050898751 +0000 UTC m=+101.741468088 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:09 crc kubenswrapper[4687]: W0228 09:05:09.645705 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a30ffde_a939_4553_9c76_62164e19d8c6.slice/crio-b9b5fad257c2ffa63cdca484644cde5f2b07ddfb93fe03b096e93352295c9b7d WatchSource:0}: Error finding container b9b5fad257c2ffa63cdca484644cde5f2b07ddfb93fe03b096e93352295c9b7d: Status 404 returned error can't find the container with id b9b5fad257c2ffa63cdca484644cde5f2b07ddfb93fe03b096e93352295c9b7d Feb 28 09:05:09 crc kubenswrapper[4687]: W0228 09:05:09.645896 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0654f4f1_e605_4c0a_9e24_90b1ce4fd440.slice/crio-cb627879c8346a700ecbaf1da5dcb87e7da9cc5a05d5e6b81b148816c54453cf WatchSource:0}: Error finding container cb627879c8346a700ecbaf1da5dcb87e7da9cc5a05d5e6b81b148816c54453cf: Status 404 returned error can't find the container with id cb627879c8346a700ecbaf1da5dcb87e7da9cc5a05d5e6b81b148816c54453cf Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.651396 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:09 crc kubenswrapper[4687]: E0228 09:05:09.651673 4687 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:10.151659909 +0000 UTC m=+101.842229246 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.758247 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:09 crc kubenswrapper[4687]: E0228 09:05:09.758900 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:10.258888395 +0000 UTC m=+101.949457731 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.760292 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-s2d57" podStartSLOduration=72.760282938 podStartE2EDuration="1m12.760282938s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:09.711077871 +0000 UTC m=+101.401647208" watchObservedRunningTime="2026-02-28 09:05:09.760282938 +0000 UTC m=+101.450852275" Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.761330 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-stsx5" podStartSLOduration=72.761324949 podStartE2EDuration="1m12.761324949s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:09.75940118 +0000 UTC m=+101.449970527" watchObservedRunningTime="2026-02-28 09:05:09.761324949 +0000 UTC m=+101.451894286" Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.819250 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n" podStartSLOduration=72.819239543 podStartE2EDuration="1m12.819239543s" 
podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:09.789049993 +0000 UTC m=+101.479619330" watchObservedRunningTime="2026-02-28 09:05:09.819239543 +0000 UTC m=+101.509808880" Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.860682 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:09 crc kubenswrapper[4687]: E0228 09:05:09.861132 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:10.361121883 +0000 UTC m=+102.051691220 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.869009 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-shn8j" podStartSLOduration=72.869000326 podStartE2EDuration="1m12.869000326s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:09.821065498 +0000 UTC m=+101.511634835" watchObservedRunningTime="2026-02-28 09:05:09.869000326 +0000 UTC m=+101.559569663" Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.924140 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-dvw6x" podStartSLOduration=72.924120332 podStartE2EDuration="1m12.924120332s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:09.922555067 +0000 UTC m=+101.613124414" watchObservedRunningTime="2026-02-28 09:05:09.924120332 +0000 UTC m=+101.614689659" Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.927592 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw" podStartSLOduration=72.927573429 podStartE2EDuration="1m12.927573429s" podCreationTimestamp="2026-02-28 09:03:57 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:09.870135571 +0000 UTC m=+101.560704909" watchObservedRunningTime="2026-02-28 09:05:09.927573429 +0000 UTC m=+101.618142766" Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.965112 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:09 crc kubenswrapper[4687]: E0228 09:05:09.970630 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:10.465392264 +0000 UTC m=+102.155961601 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:09 crc kubenswrapper[4687]: I0228 09:05:09.975392 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4j8df" podStartSLOduration=72.975370086 podStartE2EDuration="1m12.975370086s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:09.94830278 +0000 UTC m=+101.638872118" watchObservedRunningTime="2026-02-28 09:05:09.975370086 +0000 UTC m=+101.665939423" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.066952 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:10 crc kubenswrapper[4687]: E0228 09:05:10.067823 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:10.567796723 +0000 UTC m=+102.258366061 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.106604 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-6df9f" podStartSLOduration=73.106588068 podStartE2EDuration="1m13.106588068s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:10.104686389 +0000 UTC m=+101.795255727" watchObservedRunningTime="2026-02-28 09:05:10.106588068 +0000 UTC m=+101.797157394" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.120120 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7tgnq" podStartSLOduration=73.120104618 podStartE2EDuration="1m13.120104618s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:10.061475189 +0000 UTC m=+101.752044526" watchObservedRunningTime="2026-02-28 09:05:10.120104618 +0000 UTC m=+101.810673956" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.170221 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:10 crc kubenswrapper[4687]: E0228 09:05:10.170576 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:10.67056115 +0000 UTC m=+102.361130487 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.185998 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd" podStartSLOduration=5.185984027 podStartE2EDuration="5.185984027s" podCreationTimestamp="2026-02-28 09:05:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:10.14027071 +0000 UTC m=+101.830840057" watchObservedRunningTime="2026-02-28 09:05:10.185984027 +0000 UTC m=+101.876553364" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.196424 4687 ???:1] "http: TLS handshake error from 192.168.126.11:39466: no serving certificate available for the kubelet" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.221061 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-2kghk" podStartSLOduration=5.221040926 podStartE2EDuration="5.221040926s" 
podCreationTimestamp="2026-02-28 09:05:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:10.18376942 +0000 UTC m=+101.874338757" watchObservedRunningTime="2026-02-28 09:05:10.221040926 +0000 UTC m=+101.911610263" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.272449 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:10 crc kubenswrapper[4687]: E0228 09:05:10.272703 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:10.772677578 +0000 UTC m=+102.463246915 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.273070 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:10 crc kubenswrapper[4687]: E0228 09:05:10.273446 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:10.7734335 +0000 UTC m=+102.464002837 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.287758 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lmsz6" podStartSLOduration=73.287745837 podStartE2EDuration="1m13.287745837s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:10.267213768 +0000 UTC m=+101.957783105" watchObservedRunningTime="2026-02-28 09:05:10.287745837 +0000 UTC m=+101.978315175" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.289934 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-4m8kh"] Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.298728 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p4ft4" podStartSLOduration=73.298719043 podStartE2EDuration="1m13.298719043s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:10.298083909 +0000 UTC m=+101.988653266" watchObservedRunningTime="2026-02-28 09:05:10.298719043 +0000 UTC m=+101.989288380" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.336852 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-zhdhr"] Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.338506 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-47grc"] Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.344528 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w9g6h" podStartSLOduration=73.344504096 podStartE2EDuration="1m13.344504096s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:10.339089158 +0000 UTC m=+102.029658495" watchObservedRunningTime="2026-02-28 09:05:10.344504096 +0000 UTC m=+102.035073434" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.349952 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-w4s2q"] Feb 28 09:05:10 crc kubenswrapper[4687]: W0228 09:05:10.362120 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09d34ddd_da09_46e8_a9d5_5f395dbe8625.slice/crio-e295880c9e670d2e3a379dabb786bc971d87858f4b4c0b7b624f5ef110248d08 WatchSource:0}: Error finding container e295880c9e670d2e3a379dabb786bc971d87858f4b4c0b7b624f5ef110248d08: Status 404 returned error can't find the container with id e295880c9e670d2e3a379dabb786bc971d87858f4b4c0b7b624f5ef110248d08 Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.368726 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fb24z" event={"ID":"a8601fc9-0325-4c09-951c-afdda2eb2e05","Type":"ContainerStarted","Data":"b52eaa730212d4d0a654cfedb662036ec3ad5b2adb0af5ddc8fd2c19b10121a0"} Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.368767 4687 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fb24z" event={"ID":"a8601fc9-0325-4c09-951c-afdda2eb2e05","Type":"ContainerStarted","Data":"fd803c3e176d78218c5bd06a21ee4f685b4f2383957dd9fb8693bd2eca97af57"} Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.368790 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fb24z" event={"ID":"a8601fc9-0325-4c09-951c-afdda2eb2e05","Type":"ContainerStarted","Data":"0b2d604e7d48e11cac473e48e6d71903bfad7fa791c6a24241f1d9bd1ac7ea80"} Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.374532 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.374763 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.374814 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.374847 4687 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.374994 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.375589 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-zrtwj" event={"ID":"55b1fe7b-e164-4f79-835b-0cc128a680eb","Type":"ContainerStarted","Data":"d5590e3ac9a8f0a78e30f5e0bad0a809d5615c8ba29723d0316b669da7371344"} Feb 28 09:05:10 crc kubenswrapper[4687]: E0228 09:05:10.376219 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:10.876201083 +0000 UTC m=+102.566770420 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.378294 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.378400 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-bk6v7"] Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.380846 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kz4nd"] Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.387536 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-494jw" event={"ID":"7461d892-4781-495c-b78f-5fe375ed4f44","Type":"ContainerStarted","Data":"9b9ce20de6e842a48765283385ebf7b30ead0d800dc524025aa03a6db29c4caa"} Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.387572 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-494jw" event={"ID":"7461d892-4781-495c-b78f-5fe375ed4f44","Type":"ContainerStarted","Data":"aa76173f5814b211619e53a3c3d3605220e15154d8b050e710c61c27a151179f"} Feb 28 09:05:10 crc kubenswrapper[4687]: 
I0228 09:05:10.391489 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.391933 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.392863 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.403721 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4m8kh" event={"ID":"96e679f2-11c5-4ade-abc4-56a7b85a5668","Type":"ContainerStarted","Data":"7cb40cdab6c6d995623e6531ade87d03b25c4923780123a6e174ffd2b135d5f6"} Feb 28 09:05:10 crc kubenswrapper[4687]: W0228 09:05:10.409807 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaad2142e_fd55_41f2_96ca_f43b0362c071.slice/crio-dea9668a670063aeeb9c02ec98ce2f2f83467e440113190d702b4ddb74882509 WatchSource:0}: Error finding container dea9668a670063aeeb9c02ec98ce2f2f83467e440113190d702b4ddb74882509: Status 
404 returned error can't find the container with id dea9668a670063aeeb9c02ec98ce2f2f83467e440113190d702b4ddb74882509 Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.420904 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-5grmr" podStartSLOduration=73.420878812 podStartE2EDuration="1m13.420878812s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:10.417621925 +0000 UTC m=+102.108191282" watchObservedRunningTime="2026-02-28 09:05:10.420878812 +0000 UTC m=+102.111448149" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.433713 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9b6d5"] Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.441517 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-qns64" event={"ID":"8a30ffde-a939-4553-9c76-62164e19d8c6","Type":"ContainerStarted","Data":"f9f9d9726289c2d23c9bbd69e7540b9598917ee9c04e36852f5a6129445bee25"} Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.441592 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-qns64" event={"ID":"8a30ffde-a939-4553-9c76-62164e19d8c6","Type":"ContainerStarted","Data":"b9b5fad257c2ffa63cdca484644cde5f2b07ddfb93fe03b096e93352295c9b7d"} Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.465099 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w7jmh"] Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.465507 4687 generic.go:334] "Generic (PLEG): container finished" podID="0654f4f1-e605-4c0a-9e24-90b1ce4fd440" 
containerID="0bd168afd60933ccd749627831e333ca3e13b833aa585bfffb23c68b25753318" exitCode=0 Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.466967 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" event={"ID":"0654f4f1-e605-4c0a-9e24-90b1ce4fd440","Type":"ContainerDied","Data":"0bd168afd60933ccd749627831e333ca3e13b833aa585bfffb23c68b25753318"} Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.466999 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" event={"ID":"0654f4f1-e605-4c0a-9e24-90b1ce4fd440","Type":"ContainerStarted","Data":"cb627879c8346a700ecbaf1da5dcb87e7da9cc5a05d5e6b81b148816c54453cf"} Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.468588 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.490924 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3-metrics-certs\") pod \"network-metrics-daemon-7h597\" (UID: \"8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3\") " pod="openshift-multus/network-metrics-daemon-7h597" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.491795 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.494149 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n" Feb 28 09:05:10 crc 
kubenswrapper[4687]: I0228 09:05:10.508530 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vx4hz"] Feb 28 09:05:10 crc kubenswrapper[4687]: E0228 09:05:10.522582 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:11.02256493 +0000 UTC m=+102.713134267 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.530495 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9thbt"] Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.538069 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3-metrics-certs\") pod \"network-metrics-daemon-7h597\" (UID: \"8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3\") " pod="openshift-multus/network-metrics-daemon-7h597" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.539951 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mhzwl"] Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.575512 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7h597" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.577330 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537820-r5c29"] Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.579091 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.583851 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.590318 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.592687 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.593077 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5xt25"] Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.593357 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:10 crc kubenswrapper[4687]: E0228 09:05:10.594228 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-28 09:05:11.09421177 +0000 UTC m=+102.784781108 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.600115 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-26znf"] Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.614676 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kvzpk"] Feb 28 09:05:10 crc kubenswrapper[4687]: W0228 09:05:10.616389 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0596eca_6aad_4812_8c5c_06c0ab0ae911.slice/crio-bfa5cba944736529708fbe042d1b404373999ea8f9dc7698cfb65bad5d970e0b WatchSource:0}: Error finding container bfa5cba944736529708fbe042d1b404373999ea8f9dc7698cfb65bad5d970e0b: Status 404 returned error can't find the container with id bfa5cba944736529708fbe042d1b404373999ea8f9dc7698cfb65bad5d970e0b Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.619201 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8vhfl"] Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.622223 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hhlxz"] Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.624435 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gqppb" podStartSLOduration=73.624423853 podStartE2EDuration="1m13.624423853s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:10.589010956 +0000 UTC m=+102.279580303" watchObservedRunningTime="2026-02-28 09:05:10.624423853 +0000 UTC m=+102.314993190" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.625479 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-q26sq"] Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.626651 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-rsgcf" podStartSLOduration=73.626645855 podStartE2EDuration="1m13.626645855s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:10.622607296 +0000 UTC m=+102.313176643" watchObservedRunningTime="2026-02-28 09:05:10.626645855 +0000 UTC m=+102.317215192" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.627156 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx"] Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.659001 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xptn" podStartSLOduration=73.658989187 podStartE2EDuration="1m13.658989187s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:10.658224248 +0000 UTC m=+102.348793585" watchObservedRunningTime="2026-02-28 
09:05:10.658989187 +0000 UTC m=+102.349558525" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.695428 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:10 crc kubenswrapper[4687]: E0228 09:05:10.695768 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:11.195752756 +0000 UTC m=+102.886322094 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.696311 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tx86n"] Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.718218 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fb24z" podStartSLOduration=73.718196914 podStartE2EDuration="1m13.718196914s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 
09:05:10.714931761 +0000 UTC m=+102.405501098" watchObservedRunningTime="2026-02-28 09:05:10.718196914 +0000 UTC m=+102.408766251" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.718437 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw"] Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.801515 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:10 crc kubenswrapper[4687]: E0228 09:05:10.801767 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:11.301751368 +0000 UTC m=+102.992320695 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.808231 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-zrtwj" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.811393 4687 patch_prober.go:28] interesting pod/router-default-5444994796-zrtwj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 09:05:10 crc kubenswrapper[4687]: [-]has-synced failed: reason withheld Feb 28 09:05:10 crc kubenswrapper[4687]: [+]process-running ok Feb 28 09:05:10 crc kubenswrapper[4687]: healthz check failed Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.811433 4687 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zrtwj" podUID="55b1fe7b-e164-4f79-835b-0cc128a680eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.920319 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:10 crc kubenswrapper[4687]: E0228 09:05:10.920605 4687 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:11.420588936 +0000 UTC m=+103.111158274 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.925457 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-zrtwj" podStartSLOduration=73.92543879 podStartE2EDuration="1m13.92543879s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:10.919620964 +0000 UTC m=+102.610190321" watchObservedRunningTime="2026-02-28 09:05:10.92543879 +0000 UTC m=+102.616008127" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.926976 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nkgl2"] Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.927799 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nkgl2" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.932706 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.936831 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nkgl2"] Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.977149 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-494jw" podStartSLOduration=73.977131789 podStartE2EDuration="1m13.977131789s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:10.975459312 +0000 UTC m=+102.666028659" watchObservedRunningTime="2026-02-28 09:05:10.977131789 +0000 UTC m=+102.667701126" Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.992723 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f7sr6"] Feb 28 09:05:10 crc kubenswrapper[4687]: I0228 09:05:10.995195 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f7sr6" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.004497 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f7sr6"] Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.022002 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.022503 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92df9\" (UniqueName: \"kubernetes.io/projected/19def7b9-fb5d-4e49-98db-784814aa9769-kube-api-access-92df9\") pod \"community-operators-f7sr6\" (UID: \"19def7b9-fb5d-4e49-98db-784814aa9769\") " pod="openshift-marketplace/community-operators-f7sr6" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.022603 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19def7b9-fb5d-4e49-98db-784814aa9769-catalog-content\") pod \"community-operators-f7sr6\" (UID: \"19def7b9-fb5d-4e49-98db-784814aa9769\") " pod="openshift-marketplace/community-operators-f7sr6" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.022742 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/556a0190-2912-4b71-a5ae-70c614769f9d-catalog-content\") pod \"certified-operators-nkgl2\" (UID: \"556a0190-2912-4b71-a5ae-70c614769f9d\") " pod="openshift-marketplace/certified-operators-nkgl2" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.022833 4687 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19def7b9-fb5d-4e49-98db-784814aa9769-utilities\") pod \"community-operators-f7sr6\" (UID: \"19def7b9-fb5d-4e49-98db-784814aa9769\") " pod="openshift-marketplace/community-operators-f7sr6" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.022905 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn6hl\" (UniqueName: \"kubernetes.io/projected/556a0190-2912-4b71-a5ae-70c614769f9d-kube-api-access-tn6hl\") pod \"certified-operators-nkgl2\" (UID: \"556a0190-2912-4b71-a5ae-70c614769f9d\") " pod="openshift-marketplace/certified-operators-nkgl2" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.022971 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/556a0190-2912-4b71-a5ae-70c614769f9d-utilities\") pod \"certified-operators-nkgl2\" (UID: \"556a0190-2912-4b71-a5ae-70c614769f9d\") " pod="openshift-marketplace/certified-operators-nkgl2" Feb 28 09:05:11 crc kubenswrapper[4687]: E0228 09:05:11.023916 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:11.523897617 +0000 UTC m=+103.214466953 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.030911 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.128451 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92df9\" (UniqueName: \"kubernetes.io/projected/19def7b9-fb5d-4e49-98db-784814aa9769-kube-api-access-92df9\") pod \"community-operators-f7sr6\" (UID: \"19def7b9-fb5d-4e49-98db-784814aa9769\") " pod="openshift-marketplace/community-operators-f7sr6" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.128711 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19def7b9-fb5d-4e49-98db-784814aa9769-catalog-content\") pod \"community-operators-f7sr6\" (UID: \"19def7b9-fb5d-4e49-98db-784814aa9769\") " pod="openshift-marketplace/community-operators-f7sr6" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.128840 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/556a0190-2912-4b71-a5ae-70c614769f9d-catalog-content\") pod \"certified-operators-nkgl2\" (UID: \"556a0190-2912-4b71-a5ae-70c614769f9d\") " pod="openshift-marketplace/certified-operators-nkgl2" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.128870 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19def7b9-fb5d-4e49-98db-784814aa9769-utilities\") pod \"community-operators-f7sr6\" (UID: \"19def7b9-fb5d-4e49-98db-784814aa9769\") " pod="openshift-marketplace/community-operators-f7sr6" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.128890 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn6hl\" (UniqueName: \"kubernetes.io/projected/556a0190-2912-4b71-a5ae-70c614769f9d-kube-api-access-tn6hl\") pod \"certified-operators-nkgl2\" (UID: \"556a0190-2912-4b71-a5ae-70c614769f9d\") " pod="openshift-marketplace/certified-operators-nkgl2" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.128911 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/556a0190-2912-4b71-a5ae-70c614769f9d-utilities\") pod \"certified-operators-nkgl2\" (UID: \"556a0190-2912-4b71-a5ae-70c614769f9d\") " pod="openshift-marketplace/certified-operators-nkgl2" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.128941 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:11 crc kubenswrapper[4687]: E0228 09:05:11.129379 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:11.629363867 +0000 UTC m=+103.319933194 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.129843 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19def7b9-fb5d-4e49-98db-784814aa9769-catalog-content\") pod \"community-operators-f7sr6\" (UID: \"19def7b9-fb5d-4e49-98db-784814aa9769\") " pod="openshift-marketplace/community-operators-f7sr6" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.130160 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/556a0190-2912-4b71-a5ae-70c614769f9d-catalog-content\") pod \"certified-operators-nkgl2\" (UID: \"556a0190-2912-4b71-a5ae-70c614769f9d\") " pod="openshift-marketplace/certified-operators-nkgl2" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.130366 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19def7b9-fb5d-4e49-98db-784814aa9769-utilities\") pod \"community-operators-f7sr6\" (UID: \"19def7b9-fb5d-4e49-98db-784814aa9769\") " pod="openshift-marketplace/community-operators-f7sr6" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.130668 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/556a0190-2912-4b71-a5ae-70c614769f9d-utilities\") pod \"certified-operators-nkgl2\" (UID: \"556a0190-2912-4b71-a5ae-70c614769f9d\") " pod="openshift-marketplace/certified-operators-nkgl2" Feb 28 
09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.175314 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-jb8xd"] Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.183797 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7l47s"] Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.185095 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7l47s" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.190259 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn6hl\" (UniqueName: \"kubernetes.io/projected/556a0190-2912-4b71-a5ae-70c614769f9d-kube-api-access-tn6hl\") pod \"certified-operators-nkgl2\" (UID: \"556a0190-2912-4b71-a5ae-70c614769f9d\") " pod="openshift-marketplace/certified-operators-nkgl2" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.200942 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7l47s"] Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.219280 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92df9\" (UniqueName: \"kubernetes.io/projected/19def7b9-fb5d-4e49-98db-784814aa9769-kube-api-access-92df9\") pod \"community-operators-f7sr6\" (UID: \"19def7b9-fb5d-4e49-98db-784814aa9769\") " pod="openshift-marketplace/community-operators-f7sr6" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.231865 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:11 crc kubenswrapper[4687]: E0228 09:05:11.232223 4687 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:11.73220096 +0000 UTC m=+103.422770298 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.232304 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.232353 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f193be8c-c2cf-4d79-ac3d-fed262658077-catalog-content\") pod \"certified-operators-7l47s\" (UID: \"f193be8c-c2cf-4d79-ac3d-fed262658077\") " pod="openshift-marketplace/certified-operators-7l47s" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.232410 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f193be8c-c2cf-4d79-ac3d-fed262658077-utilities\") pod \"certified-operators-7l47s\" (UID: \"f193be8c-c2cf-4d79-ac3d-fed262658077\") " 
pod="openshift-marketplace/certified-operators-7l47s" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.232527 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mjhz\" (UniqueName: \"kubernetes.io/projected/f193be8c-c2cf-4d79-ac3d-fed262658077-kube-api-access-4mjhz\") pod \"certified-operators-7l47s\" (UID: \"f193be8c-c2cf-4d79-ac3d-fed262658077\") " pod="openshift-marketplace/certified-operators-7l47s" Feb 28 09:05:11 crc kubenswrapper[4687]: E0228 09:05:11.232725 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:11.732711852 +0000 UTC m=+103.423281189 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.235046 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7h597"] Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.272430 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nkgl2" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.294719 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f7sr6" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.333431 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:11 crc kubenswrapper[4687]: E0228 09:05:11.333649 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:11.833627802 +0000 UTC m=+103.524197138 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.333725 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mjhz\" (UniqueName: \"kubernetes.io/projected/f193be8c-c2cf-4d79-ac3d-fed262658077-kube-api-access-4mjhz\") pod \"certified-operators-7l47s\" (UID: \"f193be8c-c2cf-4d79-ac3d-fed262658077\") " pod="openshift-marketplace/certified-operators-7l47s" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.333827 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.333852 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f193be8c-c2cf-4d79-ac3d-fed262658077-catalog-content\") pod \"certified-operators-7l47s\" (UID: \"f193be8c-c2cf-4d79-ac3d-fed262658077\") " pod="openshift-marketplace/certified-operators-7l47s" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.333917 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f193be8c-c2cf-4d79-ac3d-fed262658077-utilities\") pod \"certified-operators-7l47s\" (UID: \"f193be8c-c2cf-4d79-ac3d-fed262658077\") " pod="openshift-marketplace/certified-operators-7l47s" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.334383 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f193be8c-c2cf-4d79-ac3d-fed262658077-utilities\") pod \"certified-operators-7l47s\" (UID: \"f193be8c-c2cf-4d79-ac3d-fed262658077\") " pod="openshift-marketplace/certified-operators-7l47s" Feb 28 09:05:11 crc kubenswrapper[4687]: E0228 09:05:11.334710 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:11.834702253 +0000 UTC m=+103.525271590 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.334986 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f193be8c-c2cf-4d79-ac3d-fed262658077-catalog-content\") pod \"certified-operators-7l47s\" (UID: \"f193be8c-c2cf-4d79-ac3d-fed262658077\") " pod="openshift-marketplace/certified-operators-7l47s" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.382951 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k67sw"] Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.385179 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k67sw" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.396075 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mjhz\" (UniqueName: \"kubernetes.io/projected/f193be8c-c2cf-4d79-ac3d-fed262658077-kube-api-access-4mjhz\") pod \"certified-operators-7l47s\" (UID: \"f193be8c-c2cf-4d79-ac3d-fed262658077\") " pod="openshift-marketplace/certified-operators-7l47s" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.411872 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k67sw"] Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.434941 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.435352 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68c495db-6852-4932-996a-053d7c113f22-utilities\") pod \"community-operators-k67sw\" (UID: \"68c495db-6852-4932-996a-053d7c113f22\") " pod="openshift-marketplace/community-operators-k67sw" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.435432 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68c495db-6852-4932-996a-053d7c113f22-catalog-content\") pod \"community-operators-k67sw\" (UID: \"68c495db-6852-4932-996a-053d7c113f22\") " pod="openshift-marketplace/community-operators-k67sw" Feb 28 09:05:11 crc kubenswrapper[4687]: E0228 09:05:11.435499 4687 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:11.935474521 +0000 UTC m=+103.626043858 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.435597 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkgrq\" (UniqueName: \"kubernetes.io/projected/68c495db-6852-4932-996a-053d7c113f22-kube-api-access-zkgrq\") pod \"community-operators-k67sw\" (UID: \"68c495db-6852-4932-996a-053d7c113f22\") " pod="openshift-marketplace/community-operators-k67sw" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.514584 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5xt25" event={"ID":"36a32d28-84e1-4c44-b2e5-546c8a1c8853","Type":"ContainerStarted","Data":"85fb5046a4c3a4a2cd21722ef55a6c768b9eedfe5a0d63021990ffdb7d8e6985"} Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.514799 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5xt25" event={"ID":"36a32d28-84e1-4c44-b2e5-546c8a1c8853","Type":"ContainerStarted","Data":"593598127fe594baf35099293c658e6c1477e5aab712b4346785b542fc00758c"} Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.515622 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/marketplace-operator-79b997595-5xt25" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.530464 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7h597" event={"ID":"8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3","Type":"ContainerStarted","Data":"29c8c51321b52ad4ac95fb86fb31206f0ac73654cdad54628508c34d3034efbd"} Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.545044 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68c495db-6852-4932-996a-053d7c113f22-catalog-content\") pod \"community-operators-k67sw\" (UID: \"68c495db-6852-4932-996a-053d7c113f22\") " pod="openshift-marketplace/community-operators-k67sw" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.545144 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkgrq\" (UniqueName: \"kubernetes.io/projected/68c495db-6852-4932-996a-053d7c113f22-kube-api-access-zkgrq\") pod \"community-operators-k67sw\" (UID: \"68c495db-6852-4932-996a-053d7c113f22\") " pod="openshift-marketplace/community-operators-k67sw" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.545191 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.545238 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68c495db-6852-4932-996a-053d7c113f22-utilities\") pod \"community-operators-k67sw\" (UID: \"68c495db-6852-4932-996a-053d7c113f22\") " 
pod="openshift-marketplace/community-operators-k67sw" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.560860 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68c495db-6852-4932-996a-053d7c113f22-utilities\") pod \"community-operators-k67sw\" (UID: \"68c495db-6852-4932-996a-053d7c113f22\") " pod="openshift-marketplace/community-operators-k67sw" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.561064 4687 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-5xt25 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.561138 4687 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-5xt25" podUID="36a32d28-84e1-4c44-b2e5-546c8a1c8853" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.561297 4687 ???:1] "http: TLS handshake error from 192.168.126.11:39468: no serving certificate available for the kubelet" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.562822 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68c495db-6852-4932-996a-053d7c113f22-catalog-content\") pod \"community-operators-k67sw\" (UID: \"68c495db-6852-4932-996a-053d7c113f22\") " pod="openshift-marketplace/community-operators-k67sw" Feb 28 09:05:11 crc kubenswrapper[4687]: E0228 09:05:11.563342 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-28 09:05:12.063328734 +0000 UTC m=+103.753898071 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.572655 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9b6d5" event={"ID":"36c2be94-93ed-4fba-9bcd-e0ebe892909e","Type":"ContainerStarted","Data":"7f3964adebe950dfd6eb1f6bf41487fefea4b334be85519f2fae597bfcae5ccd"} Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.572709 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9b6d5" event={"ID":"36c2be94-93ed-4fba-9bcd-e0ebe892909e","Type":"ContainerStarted","Data":"39ad8358b735c18d08daf87070def598d2dcbda33fb49dd382a27961b9de617f"} Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.573309 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9b6d5" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.581206 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kvzpk" event={"ID":"75b60b2e-cda7-4a73-bf67-117363db768a","Type":"ContainerStarted","Data":"624968ebb33b292966e583c9579fa6f432c540fe1c7ebcd267a1b707d5aa36f5"} Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.581311 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vx4hz" 
event={"ID":"63d9a180-a0d1-474e-a850-9a4235c5ac62","Type":"ContainerStarted","Data":"0f94169ccee0bd1a05aa12a6c26d9cc41201325cb728acde8e6f3f00fa6950f3"} Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.574809 4687 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-9b6d5 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.581454 4687 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9b6d5" podUID="36c2be94-93ed-4fba-9bcd-e0ebe892909e" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.619724 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkgrq\" (UniqueName: \"kubernetes.io/projected/68c495db-6852-4932-996a-053d7c113f22-kube-api-access-zkgrq\") pod \"community-operators-k67sw\" (UID: \"68c495db-6852-4932-996a-053d7c113f22\") " pod="openshift-marketplace/community-operators-k67sw" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.622708 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-qns64" event={"ID":"8a30ffde-a939-4553-9c76-62164e19d8c6","Type":"ContainerStarted","Data":"99b159f0956def3f8ce76e0e9542b65172c61a0f51f27dec80555dfb250c93cf"} Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.633988 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kz4nd" event={"ID":"aad2142e-fd55-41f2-96ca-f43b0362c071","Type":"ContainerStarted","Data":"d99bcda7578ff81ece4ef2cde7ce3b480a7fc3cd7d18f36be68061a6d84b7e89"} Feb 
28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.634345 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kz4nd" event={"ID":"aad2142e-fd55-41f2-96ca-f43b0362c071","Type":"ContainerStarted","Data":"dea9668a670063aeeb9c02ec98ce2f2f83467e440113190d702b4ddb74882509"} Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.654054 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-47grc" event={"ID":"3dbb3aa8-4352-49d1-b693-10281b8e4fac","Type":"ContainerStarted","Data":"4380e1585b24aea9e2727dc3b96b03288970ae92c1250bc28c233f35c3afde35"} Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.654081 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-47grc" event={"ID":"3dbb3aa8-4352-49d1-b693-10281b8e4fac","Type":"ContainerStarted","Data":"6b479d57eae99244f2f57260cc20272474fa78e359c6176262bbcc953b4ba002"} Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.655034 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:11 crc kubenswrapper[4687]: E0228 09:05:11.655438 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:12.15539344 +0000 UTC m=+103.845962777 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.662684 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:11 crc kubenswrapper[4687]: E0228 09:05:11.663281 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:12.163272624 +0000 UTC m=+103.853841961 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.675655 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4m8kh" event={"ID":"96e679f2-11c5-4ade-abc4-56a7b85a5668","Type":"ContainerStarted","Data":"170acaba5784a498318b9c514d6dd3588976fb12f0497dd619f19d8c29bbd2ba"} Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.683874 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7l47s" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.687569 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" event={"ID":"09d34ddd-da09-46e8-a9d5-5f395dbe8625","Type":"ContainerStarted","Data":"3117d5bf7cf7e3d05a65694144dbdd3fb7db368e1b4c3616f23ac8346ea7c938"} Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.687694 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" event={"ID":"09d34ddd-da09-46e8-a9d5-5f395dbe8625","Type":"ContainerStarted","Data":"e295880c9e670d2e3a379dabb786bc971d87858f4b4c0b7b624f5ef110248d08"} Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.688786 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.692672 4687 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-zhdhr container/oauth-openshift 
namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.18:6443/healthz\": dial tcp 10.217.0.18:6443: connect: connection refused" start-of-body= Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.692711 4687 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" podUID="09d34ddd-da09-46e8-a9d5-5f395dbe8625" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.18:6443/healthz\": dial tcp 10.217.0.18:6443: connect: connection refused" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.702602 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bk6v7" event={"ID":"4beb3d39-9d4e-4964-9567-67396e456053","Type":"ContainerStarted","Data":"882662b5cf44bc8f6a798ae32cc40201e34caef235bf0fafc6ec21dbb8df57a3"} Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.702646 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bk6v7" event={"ID":"4beb3d39-9d4e-4964-9567-67396e456053","Type":"ContainerStarted","Data":"96e8ab5d36b12513b77e6a5d8c69b91a203740cf4e5c4d82e454e74b1d33d9af"} Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.711551 4687 generic.go:334] "Generic (PLEG): container finished" podID="f77b68ae-c1dd-481b-a831-d4698d8f44a0" containerID="b943bdb6b0a0125bebdde9448b2dfc8c566599b3796d51f5f74a54b1921e64e6" exitCode=0 Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.711617 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-w4s2q" event={"ID":"f77b68ae-c1dd-481b-a831-d4698d8f44a0","Type":"ContainerDied","Data":"b943bdb6b0a0125bebdde9448b2dfc8c566599b3796d51f5f74a54b1921e64e6"} Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.711642 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-w4s2q" event={"ID":"f77b68ae-c1dd-481b-a831-d4698d8f44a0","Type":"ContainerStarted","Data":"9a1e525c9c0d18aadaa27fc3effc7bec9669aa4b83b165c141e09311828a3bf9"} Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.718912 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k67sw" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.724176 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-26znf" event={"ID":"664a84bd-b59d-4f25-824f-12b593193cd2","Type":"ContainerStarted","Data":"311fbf2161f528e7dfa8778bc2b834b548586debfd7bd2b8036be7e70bcce3f8"} Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.763458 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:11 crc kubenswrapper[4687]: E0228 09:05:11.765743 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:12.265714153 +0000 UTC m=+103.956283490 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.773298 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537820-r5c29" event={"ID":"e899d87a-f034-4436-8409-ca04178918b7","Type":"ContainerStarted","Data":"b1e133cb349603baef2789db9f222243fb3ab6750fdf585b440c5145746afc3c"} Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.817188 4687 patch_prober.go:28] interesting pod/router-default-5444994796-zrtwj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 09:05:11 crc kubenswrapper[4687]: [-]has-synced failed: reason withheld Feb 28 09:05:11 crc kubenswrapper[4687]: [+]process-running ok Feb 28 09:05:11 crc kubenswrapper[4687]: healthz check failed Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.817255 4687 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zrtwj" podUID="55b1fe7b-e164-4f79-835b-0cc128a680eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 09:05:11 crc kubenswrapper[4687]: W0228 09:05:11.822524 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-649545c563fdee580834525a8eaa7265b72993ea5cee5bdaafdb67cf5cff511c WatchSource:0}: Error finding container 
649545c563fdee580834525a8eaa7265b72993ea5cee5bdaafdb67cf5cff511c: Status 404 returned error can't find the container with id 649545c563fdee580834525a8eaa7265b72993ea5cee5bdaafdb67cf5cff511c Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.834213 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-q26sq" event={"ID":"100b328c-d3fd-4a0f-82e5-428f29240fc4","Type":"ContainerStarted","Data":"d9ff37f5227904f08ca0a81483b7e974254f53e4d73f2889c1ac670f450107b1"} Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.836634 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9b6d5" podStartSLOduration=74.836617435 podStartE2EDuration="1m14.836617435s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:11.823170695 +0000 UTC m=+103.513740032" watchObservedRunningTime="2026-02-28 09:05:11.836617435 +0000 UTC m=+103.527186772" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.842002 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hhlxz" event={"ID":"e00474ba-3061-4d3c-8880-05e4d50d82ae","Type":"ContainerStarted","Data":"b9cb5ba0954babbe5d33e10466e1488473a21864110e274c9377d7407a39f9b1"} Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.859034 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mhzwl" event={"ID":"a0596eca-6aad-4812-8c5c-06c0ab0ae911","Type":"ContainerStarted","Data":"bfa5cba944736529708fbe042d1b404373999ea8f9dc7698cfb65bad5d970e0b"} Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.867317 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:11 crc kubenswrapper[4687]: E0228 09:05:11.868415 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:12.368399741 +0000 UTC m=+104.058969079 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.884860 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" event={"ID":"9c633c27-c00d-4436-8b95-c327bcf08a0c","Type":"ContainerStarted","Data":"3f090f5961b37afc511b9b21cd90cf614897f312115a9a6c9e34fb521b842faa"} Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.898585 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8vhfl" event={"ID":"4aa07587-0d38-4e29-92ef-c6957b5526a8","Type":"ContainerStarted","Data":"368ff5a83f9f72f37a1cdd83ce250bff0ecea056f82281094850d12e37f0e9c1"} Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.900139 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-8vhfl" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.905317 4687 
patch_prober.go:28] interesting pod/downloads-7954f5f757-8vhfl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.905368 4687 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8vhfl" podUID="4aa07587-0d38-4e29-92ef-c6957b5526a8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.908113 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" podStartSLOduration=74.908098413 podStartE2EDuration="1m14.908098413s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:11.905103098 +0000 UTC m=+103.595672445" watchObservedRunningTime="2026-02-28 09:05:11.908098413 +0000 UTC m=+103.598667751" Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.929306 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9thbt" event={"ID":"9292d86c-b9c1-4a63-a766-c25874ffa2f5","Type":"ContainerStarted","Data":"6d1c5762da83747ed270bb93a0f410bc9574d0a55cdf29b0a83ec5d0bd59ee61"} Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.976138 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:11 crc kubenswrapper[4687]: E0228 
09:05:11.976258 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:12.476236362 +0000 UTC m=+104.166805699 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:11 crc kubenswrapper[4687]: I0228 09:05:11.976651 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:12 crc kubenswrapper[4687]: E0228 09:05:12.004652 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:12.50462265 +0000 UTC m=+104.195191987 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.006251 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-4m8kh" podStartSLOduration=75.006223993 podStartE2EDuration="1m15.006223993s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:12.005738429 +0000 UTC m=+103.696307766" watchObservedRunningTime="2026-02-28 09:05:12.006223993 +0000 UTC m=+103.696793330" Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.053561 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w7jmh" event={"ID":"a6ec553c-b9e9-4c6e-a1d1-6d730702968f","Type":"ContainerStarted","Data":"b7239ef8db6c7e94d22864e6974d9030aad9207ed2ab0ead5e8216ad59ca2de5"} Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.084049 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:12 crc kubenswrapper[4687]: E0228 09:05:12.084721 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:12.584174253 +0000 UTC m=+104.274743591 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.086996 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:12 crc kubenswrapper[4687]: E0228 09:05:12.090321 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:12.590300029 +0000 UTC m=+104.280869366 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.093151 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nkgl2"] Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.107618 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kz4nd" podStartSLOduration=75.107587715 podStartE2EDuration="1m15.107587715s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:12.035323101 +0000 UTC m=+103.725892458" watchObservedRunningTime="2026-02-28 09:05:12.107587715 +0000 UTC m=+103.798157052" Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.124451 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bk6v7" podStartSLOduration=75.124436805 podStartE2EDuration="1m15.124436805s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:12.101830102 +0000 UTC m=+103.792399459" watchObservedRunningTime="2026-02-28 09:05:12.124436805 +0000 UTC m=+103.815006143" Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.194541 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:12 crc kubenswrapper[4687]: E0228 09:05:12.195708 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:12.695680858 +0000 UTC m=+104.386250195 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.195888 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:12 crc kubenswrapper[4687]: E0228 09:05:12.196294 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:12.696283822 +0000 UTC m=+104.386853159 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.225293 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-5xt25" podStartSLOduration=75.225278685 podStartE2EDuration="1m15.225278685s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:12.22294861 +0000 UTC m=+103.913517948" watchObservedRunningTime="2026-02-28 09:05:12.225278685 +0000 UTC m=+103.915848022" Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.238037 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-qns64" podStartSLOduration=75.238012814 podStartE2EDuration="1m15.238012814s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:12.196695737 +0000 UTC m=+103.887265074" watchObservedRunningTime="2026-02-28 09:05:12.238012814 +0000 UTC m=+103.928582151" Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.290915 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-47grc" podStartSLOduration=7.290894659 podStartE2EDuration="7.290894659s" podCreationTimestamp="2026-02-28 09:05:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:12.28893401 +0000 UTC m=+103.979503357" watchObservedRunningTime="2026-02-28 09:05:12.290894659 +0000 UTC m=+103.981463995" Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.296703 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:12 crc kubenswrapper[4687]: E0228 09:05:12.297302 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:12.797285724 +0000 UTC m=+104.487855060 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.373674 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-8vhfl" podStartSLOduration=75.37041401 podStartE2EDuration="1m15.37041401s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:12.327464092 +0000 UTC m=+104.018033439" watchObservedRunningTime="2026-02-28 09:05:12.37041401 +0000 UTC m=+104.060983348" Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.379708 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f7sr6"] Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.401530 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:12 crc kubenswrapper[4687]: E0228 09:05:12.401990 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-28 09:05:12.901972176 +0000 UTC m=+104.592541513 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.504846 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:12 crc kubenswrapper[4687]: E0228 09:05:12.506565 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:13.006536059 +0000 UTC m=+104.697105395 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.508575 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k67sw"] Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.509874 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7l47s"] Feb 28 09:05:12 crc kubenswrapper[4687]: W0228 09:05:12.547894 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf193be8c_c2cf_4d79_ac3d_fed262658077.slice/crio-430421ce6c3f665f88b6cc3c6ee3c75759e5a7496f43f54297ad5fb7eb5f1192 WatchSource:0}: Error finding container 430421ce6c3f665f88b6cc3c6ee3c75759e5a7496f43f54297ad5fb7eb5f1192: Status 404 returned error can't find the container with id 430421ce6c3f665f88b6cc3c6ee3c75759e5a7496f43f54297ad5fb7eb5f1192 Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.608288 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:12 crc kubenswrapper[4687]: E0228 09:05:12.608624 4687 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:13.108613924 +0000 UTC m=+104.799183261 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.659752 4687 scope.go:117] "RemoveContainer" containerID="110dc193591d77cad10858a579d47ef5c71456399bf60b68f6b36dc40fc19406" Feb 28 09:05:12 crc kubenswrapper[4687]: E0228 09:05:12.660172 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.713171 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:12 crc kubenswrapper[4687]: E0228 09:05:12.715938 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:13.215902533 +0000 UTC m=+104.906471869 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.772982 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-svtsw"] Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.780982 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-svtsw" Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.788332 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.811599 4687 patch_prober.go:28] interesting pod/router-default-5444994796-zrtwj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 09:05:12 crc kubenswrapper[4687]: [-]has-synced failed: reason withheld Feb 28 09:05:12 crc kubenswrapper[4687]: [+]process-running ok Feb 28 09:05:12 crc kubenswrapper[4687]: healthz check failed Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.811650 4687 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zrtwj" podUID="55b1fe7b-e164-4f79-835b-0cc128a680eb" containerName="router" probeResult="failure" output="HTTP probe failed with 
statuscode: 500" Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.817085 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a9c467e-d2ff-4322-bc25-5cfe38dff784-utilities\") pod \"redhat-marketplace-svtsw\" (UID: \"9a9c467e-d2ff-4322-bc25-5cfe38dff784\") " pod="openshift-marketplace/redhat-marketplace-svtsw" Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.817173 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.817232 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a9c467e-d2ff-4322-bc25-5cfe38dff784-catalog-content\") pod \"redhat-marketplace-svtsw\" (UID: \"9a9c467e-d2ff-4322-bc25-5cfe38dff784\") " pod="openshift-marketplace/redhat-marketplace-svtsw" Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.817251 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wgdh\" (UniqueName: \"kubernetes.io/projected/9a9c467e-d2ff-4322-bc25-5cfe38dff784-kube-api-access-2wgdh\") pod \"redhat-marketplace-svtsw\" (UID: \"9a9c467e-d2ff-4322-bc25-5cfe38dff784\") " pod="openshift-marketplace/redhat-marketplace-svtsw" Feb 28 09:05:12 crc kubenswrapper[4687]: E0228 09:05:12.817566 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-28 09:05:13.3175507 +0000 UTC m=+105.008120037 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.875839 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-svtsw"] Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.920594 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:12 crc kubenswrapper[4687]: E0228 09:05:12.920648 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:13.420631442 +0000 UTC m=+105.111200778 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.920990 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a9c467e-d2ff-4322-bc25-5cfe38dff784-catalog-content\") pod \"redhat-marketplace-svtsw\" (UID: \"9a9c467e-d2ff-4322-bc25-5cfe38dff784\") " pod="openshift-marketplace/redhat-marketplace-svtsw" Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.921051 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wgdh\" (UniqueName: \"kubernetes.io/projected/9a9c467e-d2ff-4322-bc25-5cfe38dff784-kube-api-access-2wgdh\") pod \"redhat-marketplace-svtsw\" (UID: \"9a9c467e-d2ff-4322-bc25-5cfe38dff784\") " pod="openshift-marketplace/redhat-marketplace-svtsw" Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.921124 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a9c467e-d2ff-4322-bc25-5cfe38dff784-utilities\") pod \"redhat-marketplace-svtsw\" (UID: \"9a9c467e-d2ff-4322-bc25-5cfe38dff784\") " pod="openshift-marketplace/redhat-marketplace-svtsw" Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.921238 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: 
\"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:12 crc kubenswrapper[4687]: E0228 09:05:12.921558 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:13.421546714 +0000 UTC m=+105.112116051 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.921569 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a9c467e-d2ff-4322-bc25-5cfe38dff784-catalog-content\") pod \"redhat-marketplace-svtsw\" (UID: \"9a9c467e-d2ff-4322-bc25-5cfe38dff784\") " pod="openshift-marketplace/redhat-marketplace-svtsw" Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.922241 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a9c467e-d2ff-4322-bc25-5cfe38dff784-utilities\") pod \"redhat-marketplace-svtsw\" (UID: \"9a9c467e-d2ff-4322-bc25-5cfe38dff784\") " pod="openshift-marketplace/redhat-marketplace-svtsw" Feb 28 09:05:12 crc kubenswrapper[4687]: I0228 09:05:12.950461 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wgdh\" (UniqueName: \"kubernetes.io/projected/9a9c467e-d2ff-4322-bc25-5cfe38dff784-kube-api-access-2wgdh\") pod \"redhat-marketplace-svtsw\" (UID: 
\"9a9c467e-d2ff-4322-bc25-5cfe38dff784\") " pod="openshift-marketplace/redhat-marketplace-svtsw" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.022689 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:13 crc kubenswrapper[4687]: E0228 09:05:13.023161 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:13.523147001 +0000 UTC m=+105.213716337 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.072786 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vx4hz" event={"ID":"63d9a180-a0d1-474e-a850-9a4235c5ac62","Type":"ContainerStarted","Data":"373db4dcde5ffe69eb461ed0c491d985114f459b54bc4a0e46f63292dee775d4"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.072829 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vx4hz" 
event={"ID":"63d9a180-a0d1-474e-a850-9a4235c5ac62","Type":"ContainerStarted","Data":"8099d8b4057b565b65e6a6c55d250ef2062d5c9a2c5bb9f03b80dfecd985616a"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.075678 4687 generic.go:334] "Generic (PLEG): container finished" podID="f193be8c-c2cf-4d79-ac3d-fed262658077" containerID="36cee0dd39e5c0666fee9deec291840ff3de8af7e94270b6a64014c49c8eda58" exitCode=0 Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.075722 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7l47s" event={"ID":"f193be8c-c2cf-4d79-ac3d-fed262658077","Type":"ContainerDied","Data":"36cee0dd39e5c0666fee9deec291840ff3de8af7e94270b6a64014c49c8eda58"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.075739 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7l47s" event={"ID":"f193be8c-c2cf-4d79-ac3d-fed262658077","Type":"ContainerStarted","Data":"430421ce6c3f665f88b6cc3c6ee3c75759e5a7496f43f54297ad5fb7eb5f1192"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.077260 4687 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.080276 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8vhfl" event={"ID":"4aa07587-0d38-4e29-92ef-c6957b5526a8","Type":"ContainerStarted","Data":"cdac9a39ed5e9480c18b78552ac043f53ec5bd9d8290e5e9e677acd8576247f3"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.083190 4687 patch_prober.go:28] interesting pod/downloads-7954f5f757-8vhfl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.083242 4687 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-8vhfl" podUID="4aa07587-0d38-4e29-92ef-c6957b5526a8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.090411 4687 generic.go:334] "Generic (PLEG): container finished" podID="68c495db-6852-4932-996a-053d7c113f22" containerID="ee11752ea57cff8fbbf450d7a5e8036af1da6ca5769933a005dee76de0968284" exitCode=0 Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.090510 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k67sw" event={"ID":"68c495db-6852-4932-996a-053d7c113f22","Type":"ContainerDied","Data":"ee11752ea57cff8fbbf450d7a5e8036af1da6ca5769933a005dee76de0968284"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.090563 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k67sw" event={"ID":"68c495db-6852-4932-996a-053d7c113f22","Type":"ContainerStarted","Data":"c004fca93b67d0e542a5baf69a27021c115b2aad4c1022c9267620e18e2e89ab"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.093134 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" event={"ID":"0654f4f1-e605-4c0a-9e24-90b1ce4fd440","Type":"ContainerStarted","Data":"edab9c31cc2e9ad039a4b30f0bb0663fd97afb711a25e55455f7b0e0c412086c"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.093164 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" event={"ID":"0654f4f1-e605-4c0a-9e24-90b1ce4fd440","Type":"ContainerStarted","Data":"274e99efed1aa55cf1a480e0aca77234643798e9eba8c0738b1beaaa0b4955a3"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.095895 4687 generic.go:334] "Generic (PLEG): container finished" podID="556a0190-2912-4b71-a5ae-70c614769f9d" 
containerID="fd8cd6896f58aec20617dd916cb60d6f4c397640f75a779e71f69a4b89917ac3" exitCode=0 Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.095943 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nkgl2" event={"ID":"556a0190-2912-4b71-a5ae-70c614769f9d","Type":"ContainerDied","Data":"fd8cd6896f58aec20617dd916cb60d6f4c397640f75a779e71f69a4b89917ac3"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.095965 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nkgl2" event={"ID":"556a0190-2912-4b71-a5ae-70c614769f9d","Type":"ContainerStarted","Data":"85c58770d4bc3b236df8ddb1c1b1bb88ad29cde19f8389f84330e429a725cb2d"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.098800 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"64e3a2282ac40964a47e0b0d4a6fe512258a9ae897bc608f08c5ae335e9bf644"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.098825 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b2cfd50724aa8ea3eeb407cd27731d483a24064b84896e3994829dea3e367c8e"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.110515 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vx4hz" podStartSLOduration=76.11049933 podStartE2EDuration="1m16.11049933s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:13.100947478 +0000 UTC m=+104.791516816" watchObservedRunningTime="2026-02-28 09:05:13.11049933 +0000 UTC 
m=+104.801068667" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.124932 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.125436 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-w4s2q" event={"ID":"f77b68ae-c1dd-481b-a831-d4698d8f44a0","Type":"ContainerStarted","Data":"3c7a7cf7fd6a093455b0cdd33e037e1f743fcd81737b7800f9cd51468f17bad8"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.126152 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-w4s2q" Feb 28 09:05:13 crc kubenswrapper[4687]: E0228 09:05:13.128389 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:13.62837352 +0000 UTC m=+105.318942857 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.138346 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-q26sq" event={"ID":"100b328c-d3fd-4a0f-82e5-428f29240fc4","Type":"ContainerStarted","Data":"ee72f07c128d7a4fd1bfc8398f016d61244130110169e1aa2b84e38a21ff5a7f"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.146612 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mhzwl" event={"ID":"a0596eca-6aad-4812-8c5c-06c0ab0ae911","Type":"ContainerStarted","Data":"750b58f2078406bae0af46eab345316c99d6f5e8d233cd6b53f39f5e29a154ba"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.146662 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mhzwl" event={"ID":"a0596eca-6aad-4812-8c5c-06c0ab0ae911","Type":"ContainerStarted","Data":"ace8d2de1c17b865f1dd02a5b86895ad9b5056825350a5d031564e6a71675020"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.147980 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mhzwl" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.150921 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-svtsw" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.156510 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hhlxz" event={"ID":"e00474ba-3061-4d3c-8880-05e4d50d82ae","Type":"ContainerStarted","Data":"2d548e6ef4ec35bf665ffd47734abd801010e095f2aaa7856931cfbd0fc5682a"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.156551 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hhlxz" event={"ID":"e00474ba-3061-4d3c-8880-05e4d50d82ae","Type":"ContainerStarted","Data":"f1909007239974bcd855db66c2514fd3ebf1cd2df75bd6e7208cd64552bfc6b7"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.167801 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f8gpq"] Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.168348 4687 generic.go:334] "Generic (PLEG): container finished" podID="19def7b9-fb5d-4e49-98db-784814aa9769" containerID="4d5a644f693b9d6c4725eb3bdefdb5c105449209e31490049ade8ca2c7770e1a" exitCode=0 Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.168901 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7sr6" event={"ID":"19def7b9-fb5d-4e49-98db-784814aa9769","Type":"ContainerDied","Data":"4d5a644f693b9d6c4725eb3bdefdb5c105449209e31490049ade8ca2c7770e1a"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.168987 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7sr6" event={"ID":"19def7b9-fb5d-4e49-98db-784814aa9769","Type":"ContainerStarted","Data":"e1bab61eeb1ab6f618584924a2c5e62e42dbec831ffc6cd5dbec9187740bca27"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.169178 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8gpq" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.172453 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" podStartSLOduration=76.172438567 podStartE2EDuration="1m16.172438567s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:13.168148043 +0000 UTC m=+104.858717401" watchObservedRunningTime="2026-02-28 09:05:13.172438567 +0000 UTC m=+104.863007894" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.179588 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7h597" event={"ID":"8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3","Type":"ContainerStarted","Data":"3b288bf44bf727f81d379a8c719e9926244b5604a3dddf663da603ac6f8ef2b0"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.186876 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8gpq"] Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.215205 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-26znf" event={"ID":"664a84bd-b59d-4f25-824f-12b593193cd2","Type":"ContainerStarted","Data":"29e3dc8857f26a796d6d30b125e5e1d7475a49aea18a97382898bd62f042d0c1"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.226003 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.226294 4687 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v79xk\" (UniqueName: \"kubernetes.io/projected/512bb25a-8693-4a78-afcc-77e005a73c0f-kube-api-access-v79xk\") pod \"redhat-marketplace-f8gpq\" (UID: \"512bb25a-8693-4a78-afcc-77e005a73c0f\") " pod="openshift-marketplace/redhat-marketplace-f8gpq" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.226355 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/512bb25a-8693-4a78-afcc-77e005a73c0f-catalog-content\") pod \"redhat-marketplace-f8gpq\" (UID: \"512bb25a-8693-4a78-afcc-77e005a73c0f\") " pod="openshift-marketplace/redhat-marketplace-f8gpq" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.226379 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/512bb25a-8693-4a78-afcc-77e005a73c0f-utilities\") pod \"redhat-marketplace-f8gpq\" (UID: \"512bb25a-8693-4a78-afcc-77e005a73c0f\") " pod="openshift-marketplace/redhat-marketplace-f8gpq" Feb 28 09:05:13 crc kubenswrapper[4687]: E0228 09:05:13.227187 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:13.727166665 +0000 UTC m=+105.417736002 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.235849 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9thbt" event={"ID":"9292d86c-b9c1-4a63-a766-c25874ffa2f5","Type":"ContainerStarted","Data":"0f6fc0baec825b631034d330a5884a6e66418a968350687b5b8375cee5a43a5c"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.235895 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9thbt" event={"ID":"9292d86c-b9c1-4a63-a766-c25874ffa2f5","Type":"ContainerStarted","Data":"9ba691097f7696cb1471e8eaabfb3ddd81058fb0ead302a4fc69b2bba80527a3"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.257692 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w7jmh" event={"ID":"a6ec553c-b9e9-4c6e-a1d1-6d730702968f","Type":"ContainerStarted","Data":"dd285bbadcdc3946747921ea5d460eeeedc135f034d471837aa3f1a18ac6216a"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.257770 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w7jmh" event={"ID":"a6ec553c-b9e9-4c6e-a1d1-6d730702968f","Type":"ContainerStarted","Data":"172f5b4cedd4894b5ef72ba203b9234bb4c5bd8269cae6b5b3b4e95b26e5a013"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.263785 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hhlxz" podStartSLOduration=76.263765223 podStartE2EDuration="1m16.263765223s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:13.261433006 +0000 UTC m=+104.952002362" watchObservedRunningTime="2026-02-28 09:05:13.263765223 +0000 UTC m=+104.954334560" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.278582 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-q26sq" podStartSLOduration=76.278564187 podStartE2EDuration="1m16.278564187s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:13.276255363 +0000 UTC m=+104.966824721" watchObservedRunningTime="2026-02-28 09:05:13.278564187 +0000 UTC m=+104.969133525" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.280168 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537820-r5c29" event={"ID":"e899d87a-f034-4436-8409-ca04178918b7","Type":"ContainerStarted","Data":"a60b310cdec6733ac239fb8523ff17cee642452cef4858be568fedb8075066e1"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.282007 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c294e75d7b2a5a9e1ad87b1aa1b9225d2f6f6d116dafd1106b652d731c329ced"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.282052 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b64f73ad581ae118583fbd0faea0cae4252267c22e50cb36eae132d624203de7"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.282571 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.293723 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kvzpk" event={"ID":"75b60b2e-cda7-4a73-bf67-117363db768a","Type":"ContainerStarted","Data":"5053eeaf7ae84667b42af87eab8623eac52a6a73cb18e066fa3ce9cb76317cce"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.293757 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kvzpk" event={"ID":"75b60b2e-cda7-4a73-bf67-117363db768a","Type":"ContainerStarted","Data":"b4f6aa3d71b1585114a367782fc49b236938cc3b586af65f7a8f490b9e185832"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.293873 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-kvzpk" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.302983 4687 generic.go:334] "Generic (PLEG): container finished" podID="9c633c27-c00d-4436-8b95-c327bcf08a0c" containerID="e6da36fe23b3bc06b0c399c9b2dba1efb62464d3e017312eab761f91101ed3d8" exitCode=0 Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.303052 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" event={"ID":"9c633c27-c00d-4436-8b95-c327bcf08a0c","Type":"ContainerDied","Data":"e6da36fe23b3bc06b0c399c9b2dba1efb62464d3e017312eab761f91101ed3d8"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.304884 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-w4s2q" podStartSLOduration=76.304863969 podStartE2EDuration="1m16.304863969s" 
podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:13.303492158 +0000 UTC m=+104.994061505" watchObservedRunningTime="2026-02-28 09:05:13.304863969 +0000 UTC m=+104.995433307" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.315955 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n" podUID="eaa6a825-72b4-4544-9e19-5af6b2c7648e" containerName="controller-manager" containerID="cri-o://6e973c19e8826a1f009fac80fc4f07882a7ae803718832a7495e81b398a11e0b" gracePeriod=30 Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.316428 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd" podUID="3b45242a-b238-4814-b6fa-f22a62c5907f" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://e6b6d91b683c4824d9fc5ef34d2ab0bd79f749327d9140168cea7b673cb637a6" gracePeriod=30 Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.318417 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a5ef822c6e07a83555d5b7fdb2f3737feac481272a19423d2888784190b651a9"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.319072 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"649545c563fdee580834525a8eaa7265b72993ea5cee5bdaafdb67cf5cff511c"} Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.318524 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mhzwl" podStartSLOduration=76.318511086 podStartE2EDuration="1m16.318511086s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:13.31752426 +0000 UTC m=+105.008093607" watchObservedRunningTime="2026-02-28 09:05:13.318511086 +0000 UTC m=+105.009080423" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.318448 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw" podUID="04e92ab6-5602-4d97-9e70-f95ff9769a79" containerName="route-controller-manager" containerID="cri-o://6815c81392e78fe2f1ed67021a45475bdcf55c9543991ada7fcff7ba4898c6c3" gracePeriod=30 Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.323112 4687 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-5xt25 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.323156 4687 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-5xt25" podUID="36a32d28-84e1-4c44-b2e5-546c8a1c8853" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.328243 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9b6d5" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.328396 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.328826 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/512bb25a-8693-4a78-afcc-77e005a73c0f-utilities\") pod \"redhat-marketplace-f8gpq\" (UID: \"512bb25a-8693-4a78-afcc-77e005a73c0f\") " pod="openshift-marketplace/redhat-marketplace-f8gpq" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.328892 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.329002 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v79xk\" (UniqueName: \"kubernetes.io/projected/512bb25a-8693-4a78-afcc-77e005a73c0f-kube-api-access-v79xk\") pod \"redhat-marketplace-f8gpq\" (UID: \"512bb25a-8693-4a78-afcc-77e005a73c0f\") " pod="openshift-marketplace/redhat-marketplace-f8gpq" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.329116 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/512bb25a-8693-4a78-afcc-77e005a73c0f-catalog-content\") pod \"redhat-marketplace-f8gpq\" (UID: \"512bb25a-8693-4a78-afcc-77e005a73c0f\") " pod="openshift-marketplace/redhat-marketplace-f8gpq" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.329452 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/512bb25a-8693-4a78-afcc-77e005a73c0f-catalog-content\") pod \"redhat-marketplace-f8gpq\" (UID: 
\"512bb25a-8693-4a78-afcc-77e005a73c0f\") " pod="openshift-marketplace/redhat-marketplace-f8gpq" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.330288 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/512bb25a-8693-4a78-afcc-77e005a73c0f-utilities\") pod \"redhat-marketplace-f8gpq\" (UID: \"512bb25a-8693-4a78-afcc-77e005a73c0f\") " pod="openshift-marketplace/redhat-marketplace-f8gpq" Feb 28 09:05:13 crc kubenswrapper[4687]: E0228 09:05:13.330832 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:13.830821727 +0000 UTC m=+105.521391065 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.345227 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-9thbt" podStartSLOduration=76.345207344 podStartE2EDuration="1m16.345207344s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:13.344752878 +0000 UTC m=+105.035322215" watchObservedRunningTime="2026-02-28 09:05:13.345207344 +0000 UTC m=+105.035776680" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.362357 4687 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-v79xk\" (UniqueName: \"kubernetes.io/projected/512bb25a-8693-4a78-afcc-77e005a73c0f-kube-api-access-v79xk\") pod \"redhat-marketplace-f8gpq\" (UID: \"512bb25a-8693-4a78-afcc-77e005a73c0f\") " pod="openshift-marketplace/redhat-marketplace-f8gpq" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.366615 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.366969 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.428583 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w7jmh" podStartSLOduration=76.42856631 podStartE2EDuration="1m16.42856631s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:13.425907498 +0000 UTC m=+105.116476834" watchObservedRunningTime="2026-02-28 09:05:13.42856631 +0000 UTC m=+105.119135648" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.431478 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:13 crc kubenswrapper[4687]: E0228 09:05:13.446475 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-28 09:05:13.946450198 +0000 UTC m=+105.637019535 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.530434 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8gpq" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.533767 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:13 crc kubenswrapper[4687]: E0228 09:05:13.534160 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:14.034148418 +0000 UTC m=+105.724717755 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.558730 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.559548 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.568290 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.584528 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.591616 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-kvzpk" podStartSLOduration=8.591606513 podStartE2EDuration="8.591606513s" podCreationTimestamp="2026-02-28 09:05:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:13.589314801 +0000 UTC m=+105.279884148" watchObservedRunningTime="2026-02-28 09:05:13.591606513 +0000 UTC m=+105.282175850" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.593832 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 28 09:05:13 crc 
kubenswrapper[4687]: I0228 09:05:13.637611 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.637840 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/308fcc37-8d88-4077-9f16-e82ab7f6d067-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"308fcc37-8d88-4077-9f16-e82ab7f6d067\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.637911 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/308fcc37-8d88-4077-9f16-e82ab7f6d067-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"308fcc37-8d88-4077-9f16-e82ab7f6d067\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 28 09:05:13 crc kubenswrapper[4687]: E0228 09:05:13.638039 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:14.138000392 +0000 UTC m=+105.828569729 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.684151 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.700710 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29537820-r5c29" podStartSLOduration=76.70069585 podStartE2EDuration="1m16.70069585s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:13.697161079 +0000 UTC m=+105.387730426" watchObservedRunningTime="2026-02-28 09:05:13.70069585 +0000 UTC m=+105.391265188" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.739717 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/308fcc37-8d88-4077-9f16-e82ab7f6d067-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"308fcc37-8d88-4077-9f16-e82ab7f6d067\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.740117 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/308fcc37-8d88-4077-9f16-e82ab7f6d067-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"308fcc37-8d88-4077-9f16-e82ab7f6d067\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.740168 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:13 crc kubenswrapper[4687]: E0228 09:05:13.740514 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:14.240499449 +0000 UTC m=+105.931068786 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.740676 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/308fcc37-8d88-4077-9f16-e82ab7f6d067-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"308fcc37-8d88-4077-9f16-e82ab7f6d067\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.760322 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-svtsw"] Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.774456 4687 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/308fcc37-8d88-4077-9f16-e82ab7f6d067-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"308fcc37-8d88-4077-9f16-e82ab7f6d067\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 28 09:05:13 crc kubenswrapper[4687]: W0228 09:05:13.775231 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a9c467e_d2ff_4322_bc25_5cfe38dff784.slice/crio-afb5a525dca570e091fd8a08e97d3c21592dddb59f94b5917f1d7ccb0a3e32da WatchSource:0}: Error finding container afb5a525dca570e091fd8a08e97d3c21592dddb59f94b5917f1d7ccb0a3e32da: Status 404 returned error can't find the container with id afb5a525dca570e091fd8a08e97d3c21592dddb59f94b5917f1d7ccb0a3e32da Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.811889 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=0.811868928 podStartE2EDuration="811.868928ms" podCreationTimestamp="2026-02-28 09:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:13.800545553 +0000 UTC m=+105.491114911" watchObservedRunningTime="2026-02-28 09:05:13.811868928 +0000 UTC m=+105.502438265" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.812195 4687 patch_prober.go:28] interesting pod/router-default-5444994796-zrtwj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 09:05:13 crc kubenswrapper[4687]: [-]has-synced failed: reason withheld Feb 28 09:05:13 crc kubenswrapper[4687]: [+]process-running ok Feb 28 09:05:13 crc kubenswrapper[4687]: healthz check failed Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 
09:05:13.812240 4687 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zrtwj" podUID="55b1fe7b-e164-4f79-835b-0cc128a680eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.842001 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:13 crc kubenswrapper[4687]: E0228 09:05:13.842153 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:14.342129842 +0000 UTC m=+106.032699179 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.842335 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:13 crc kubenswrapper[4687]: E0228 09:05:13.842708 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:14.342695977 +0000 UTC m=+106.033265315 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.939168 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.946643 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:13 crc kubenswrapper[4687]: E0228 09:05:13.946982 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:14.446965086 +0000 UTC m=+106.137534422 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.973642 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-npwxl"] Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.974759 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-npwxl" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.976313 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 28 09:05:13 crc kubenswrapper[4687]: I0228 09:05:13.988372 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-npwxl"] Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.047870 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69eb70ff-d8c7-4dba-9f8e-1969b7947640-catalog-content\") pod \"redhat-operators-npwxl\" (UID: \"69eb70ff-d8c7-4dba-9f8e-1969b7947640\") " pod="openshift-marketplace/redhat-operators-npwxl" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.047915 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69eb70ff-d8c7-4dba-9f8e-1969b7947640-utilities\") pod \"redhat-operators-npwxl\" (UID: \"69eb70ff-d8c7-4dba-9f8e-1969b7947640\") " pod="openshift-marketplace/redhat-operators-npwxl" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.047982 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.048012 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzd4r\" (UniqueName: \"kubernetes.io/projected/69eb70ff-d8c7-4dba-9f8e-1969b7947640-kube-api-access-jzd4r\") pod 
\"redhat-operators-npwxl\" (UID: \"69eb70ff-d8c7-4dba-9f8e-1969b7947640\") " pod="openshift-marketplace/redhat-operators-npwxl" Feb 28 09:05:14 crc kubenswrapper[4687]: E0228 09:05:14.048386 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:14.548371558 +0000 UTC m=+106.238940895 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.052053 4687 patch_prober.go:28] interesting pod/apiserver-76f77b778f-bqdqx container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 28 09:05:14 crc kubenswrapper[4687]: [+]log ok Feb 28 09:05:14 crc kubenswrapper[4687]: [+]etcd ok Feb 28 09:05:14 crc kubenswrapper[4687]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 28 09:05:14 crc kubenswrapper[4687]: [+]poststarthook/generic-apiserver-start-informers ok Feb 28 09:05:14 crc kubenswrapper[4687]: [+]poststarthook/max-in-flight-filter ok Feb 28 09:05:14 crc kubenswrapper[4687]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 28 09:05:14 crc kubenswrapper[4687]: [+]poststarthook/image.openshift.io-apiserver-caches ok Feb 28 09:05:14 crc kubenswrapper[4687]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Feb 28 09:05:14 crc kubenswrapper[4687]: 
[-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Feb 28 09:05:14 crc kubenswrapper[4687]: [+]poststarthook/project.openshift.io-projectcache ok Feb 28 09:05:14 crc kubenswrapper[4687]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Feb 28 09:05:14 crc kubenswrapper[4687]: [+]poststarthook/openshift.io-startinformers ok Feb 28 09:05:14 crc kubenswrapper[4687]: [+]poststarthook/openshift.io-restmapperupdater ok Feb 28 09:05:14 crc kubenswrapper[4687]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 28 09:05:14 crc kubenswrapper[4687]: livez check failed Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.052119 4687 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" podUID="0654f4f1-e605-4c0a-9e24-90b1ce4fd440" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.140877 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.152345 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.152555 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzd4r\" (UniqueName: \"kubernetes.io/projected/69eb70ff-d8c7-4dba-9f8e-1969b7947640-kube-api-access-jzd4r\") pod \"redhat-operators-npwxl\" (UID: \"69eb70ff-d8c7-4dba-9f8e-1969b7947640\") " pod="openshift-marketplace/redhat-operators-npwxl" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.152602 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69eb70ff-d8c7-4dba-9f8e-1969b7947640-catalog-content\") pod \"redhat-operators-npwxl\" (UID: \"69eb70ff-d8c7-4dba-9f8e-1969b7947640\") " pod="openshift-marketplace/redhat-operators-npwxl" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.152623 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69eb70ff-d8c7-4dba-9f8e-1969b7947640-utilities\") pod \"redhat-operators-npwxl\" (UID: \"69eb70ff-d8c7-4dba-9f8e-1969b7947640\") " pod="openshift-marketplace/redhat-operators-npwxl" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.153231 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69eb70ff-d8c7-4dba-9f8e-1969b7947640-utilities\") pod \"redhat-operators-npwxl\" (UID: \"69eb70ff-d8c7-4dba-9f8e-1969b7947640\") " 
pod="openshift-marketplace/redhat-operators-npwxl" Feb 28 09:05:14 crc kubenswrapper[4687]: E0228 09:05:14.153259 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:14.653226628 +0000 UTC m=+106.343795965 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.153478 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69eb70ff-d8c7-4dba-9f8e-1969b7947640-catalog-content\") pod \"redhat-operators-npwxl\" (UID: \"69eb70ff-d8c7-4dba-9f8e-1969b7947640\") " pod="openshift-marketplace/redhat-operators-npwxl" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.153606 4687 ???:1] "http: TLS handshake error from 192.168.126.11:39470: no serving certificate available for the kubelet" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.173282 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.188045 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl"] Feb 28 09:05:14 crc kubenswrapper[4687]: E0228 09:05:14.188242 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaa6a825-72b4-4544-9e19-5af6b2c7648e" containerName="controller-manager" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.188259 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaa6a825-72b4-4544-9e19-5af6b2c7648e" containerName="controller-manager" Feb 28 09:05:14 crc kubenswrapper[4687]: E0228 09:05:14.188275 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04e92ab6-5602-4d97-9e70-f95ff9769a79" containerName="route-controller-manager" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.188281 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="04e92ab6-5602-4d97-9e70-f95ff9769a79" containerName="route-controller-manager" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.188357 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="04e92ab6-5602-4d97-9e70-f95ff9769a79" containerName="route-controller-manager" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.188369 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaa6a825-72b4-4544-9e19-5af6b2c7648e" containerName="controller-manager" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.211795 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzd4r\" (UniqueName: \"kubernetes.io/projected/69eb70ff-d8c7-4dba-9f8e-1969b7947640-kube-api-access-jzd4r\") pod \"redhat-operators-npwxl\" (UID: \"69eb70ff-d8c7-4dba-9f8e-1969b7947640\") " pod="openshift-marketplace/redhat-operators-npwxl" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.212561 4687 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.254170 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl"] Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.254806 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04e92ab6-5602-4d97-9e70-f95ff9769a79-client-ca\") pod \"04e92ab6-5602-4d97-9e70-f95ff9769a79\" (UID: \"04e92ab6-5602-4d97-9e70-f95ff9769a79\") " Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.254870 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eaa6a825-72b4-4544-9e19-5af6b2c7648e-proxy-ca-bundles\") pod \"eaa6a825-72b4-4544-9e19-5af6b2c7648e\" (UID: \"eaa6a825-72b4-4544-9e19-5af6b2c7648e\") " Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.254889 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eaa6a825-72b4-4544-9e19-5af6b2c7648e-client-ca\") pod \"eaa6a825-72b4-4544-9e19-5af6b2c7648e\" (UID: \"eaa6a825-72b4-4544-9e19-5af6b2c7648e\") " Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.254925 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaa6a825-72b4-4544-9e19-5af6b2c7648e-config\") pod \"eaa6a825-72b4-4544-9e19-5af6b2c7648e\" (UID: \"eaa6a825-72b4-4544-9e19-5af6b2c7648e\") " Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.254952 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr4lw\" (UniqueName: \"kubernetes.io/projected/eaa6a825-72b4-4544-9e19-5af6b2c7648e-kube-api-access-vr4lw\") pod 
\"eaa6a825-72b4-4544-9e19-5af6b2c7648e\" (UID: \"eaa6a825-72b4-4544-9e19-5af6b2c7648e\") " Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.254978 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04e92ab6-5602-4d97-9e70-f95ff9769a79-serving-cert\") pod \"04e92ab6-5602-4d97-9e70-f95ff9769a79\" (UID: \"04e92ab6-5602-4d97-9e70-f95ff9769a79\") " Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.255044 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04e92ab6-5602-4d97-9e70-f95ff9769a79-config\") pod \"04e92ab6-5602-4d97-9e70-f95ff9769a79\" (UID: \"04e92ab6-5602-4d97-9e70-f95ff9769a79\") " Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.255070 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eaa6a825-72b4-4544-9e19-5af6b2c7648e-serving-cert\") pod \"eaa6a825-72b4-4544-9e19-5af6b2c7648e\" (UID: \"eaa6a825-72b4-4544-9e19-5af6b2c7648e\") " Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.255103 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wxsl\" (UniqueName: \"kubernetes.io/projected/04e92ab6-5602-4d97-9e70-f95ff9769a79-kube-api-access-6wxsl\") pod \"04e92ab6-5602-4d97-9e70-f95ff9769a79\" (UID: \"04e92ab6-5602-4d97-9e70-f95ff9769a79\") " Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.255319 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-client-ca\") pod \"controller-manager-6f9f58c4dd-tlnbl\" (UID: \"ea8ade20-0f06-4868-8df4-70d1c2fb40ce\") " pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.255374 4687 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p59kc\" (UniqueName: \"kubernetes.io/projected/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-kube-api-access-p59kc\") pod \"controller-manager-6f9f58c4dd-tlnbl\" (UID: \"ea8ade20-0f06-4868-8df4-70d1c2fb40ce\") " pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.255443 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.255467 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-proxy-ca-bundles\") pod \"controller-manager-6f9f58c4dd-tlnbl\" (UID: \"ea8ade20-0f06-4868-8df4-70d1c2fb40ce\") " pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.255492 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-serving-cert\") pod \"controller-manager-6f9f58c4dd-tlnbl\" (UID: \"ea8ade20-0f06-4868-8df4-70d1c2fb40ce\") " pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.255536 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-config\") pod 
\"controller-manager-6f9f58c4dd-tlnbl\" (UID: \"ea8ade20-0f06-4868-8df4-70d1c2fb40ce\") " pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.256428 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaa6a825-72b4-4544-9e19-5af6b2c7648e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "eaa6a825-72b4-4544-9e19-5af6b2c7648e" (UID: "eaa6a825-72b4-4544-9e19-5af6b2c7648e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.257298 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaa6a825-72b4-4544-9e19-5af6b2c7648e-client-ca" (OuterVolumeSpecName: "client-ca") pod "eaa6a825-72b4-4544-9e19-5af6b2c7648e" (UID: "eaa6a825-72b4-4544-9e19-5af6b2c7648e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.256553 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaa6a825-72b4-4544-9e19-5af6b2c7648e-config" (OuterVolumeSpecName: "config") pod "eaa6a825-72b4-4544-9e19-5af6b2c7648e" (UID: "eaa6a825-72b4-4544-9e19-5af6b2c7648e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.260048 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04e92ab6-5602-4d97-9e70-f95ff9769a79-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "04e92ab6-5602-4d97-9e70-f95ff9769a79" (UID: "04e92ab6-5602-4d97-9e70-f95ff9769a79"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.261286 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04e92ab6-5602-4d97-9e70-f95ff9769a79-client-ca" (OuterVolumeSpecName: "client-ca") pod "04e92ab6-5602-4d97-9e70-f95ff9769a79" (UID: "04e92ab6-5602-4d97-9e70-f95ff9769a79"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:05:14 crc kubenswrapper[4687]: E0228 09:05:14.261565 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:14.761550634 +0000 UTC m=+106.452119972 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.261568 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04e92ab6-5602-4d97-9e70-f95ff9769a79-config" (OuterVolumeSpecName: "config") pod "04e92ab6-5602-4d97-9e70-f95ff9769a79" (UID: "04e92ab6-5602-4d97-9e70-f95ff9769a79"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.262804 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaa6a825-72b4-4544-9e19-5af6b2c7648e-kube-api-access-vr4lw" (OuterVolumeSpecName: "kube-api-access-vr4lw") pod "eaa6a825-72b4-4544-9e19-5af6b2c7648e" (UID: "eaa6a825-72b4-4544-9e19-5af6b2c7648e"). InnerVolumeSpecName "kube-api-access-vr4lw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.269481 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa6a825-72b4-4544-9e19-5af6b2c7648e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "eaa6a825-72b4-4544-9e19-5af6b2c7648e" (UID: "eaa6a825-72b4-4544-9e19-5af6b2c7648e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.274820 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04e92ab6-5602-4d97-9e70-f95ff9769a79-kube-api-access-6wxsl" (OuterVolumeSpecName: "kube-api-access-6wxsl") pod "04e92ab6-5602-4d97-9e70-f95ff9769a79" (UID: "04e92ab6-5602-4d97-9e70-f95ff9769a79"). InnerVolumeSpecName "kube-api-access-6wxsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.282824 4687 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.300567 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-npwxl" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.343445 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8gpq"] Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.357330 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.357597 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-proxy-ca-bundles\") pod \"controller-manager-6f9f58c4dd-tlnbl\" (UID: \"ea8ade20-0f06-4868-8df4-70d1c2fb40ce\") " pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.357632 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-serving-cert\") pod \"controller-manager-6f9f58c4dd-tlnbl\" (UID: \"ea8ade20-0f06-4868-8df4-70d1c2fb40ce\") " pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.357676 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-config\") pod \"controller-manager-6f9f58c4dd-tlnbl\" (UID: \"ea8ade20-0f06-4868-8df4-70d1c2fb40ce\") " pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.357746 4687 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-client-ca\") pod \"controller-manager-6f9f58c4dd-tlnbl\" (UID: \"ea8ade20-0f06-4868-8df4-70d1c2fb40ce\") " pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.357794 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p59kc\" (UniqueName: \"kubernetes.io/projected/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-kube-api-access-p59kc\") pod \"controller-manager-6f9f58c4dd-tlnbl\" (UID: \"ea8ade20-0f06-4868-8df4-70d1c2fb40ce\") " pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.357828 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6qvt4"] Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.357854 4687 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eaa6a825-72b4-4544-9e19-5af6b2c7648e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.357867 4687 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eaa6a825-72b4-4544-9e19-5af6b2c7648e-client-ca\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.357876 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaa6a825-72b4-4544-9e19-5af6b2c7648e-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.357885 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr4lw\" (UniqueName: \"kubernetes.io/projected/eaa6a825-72b4-4544-9e19-5af6b2c7648e-kube-api-access-vr4lw\") on node \"crc\" DevicePath \"\"" Feb 28 
09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.357895 4687 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04e92ab6-5602-4d97-9e70-f95ff9769a79-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.357904 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04e92ab6-5602-4d97-9e70-f95ff9769a79-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.357911 4687 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eaa6a825-72b4-4544-9e19-5af6b2c7648e-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.357921 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wxsl\" (UniqueName: \"kubernetes.io/projected/04e92ab6-5602-4d97-9e70-f95ff9769a79-kube-api-access-6wxsl\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.357929 4687 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04e92ab6-5602-4d97-9e70-f95ff9769a79-client-ca\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:14 crc kubenswrapper[4687]: E0228 09:05:14.358421 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:14.85840365 +0000 UTC m=+106.548972987 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.358769 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6qvt4" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.359693 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-proxy-ca-bundles\") pod \"controller-manager-6f9f58c4dd-tlnbl\" (UID: \"ea8ade20-0f06-4868-8df4-70d1c2fb40ce\") " pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.360420 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-client-ca\") pod \"controller-manager-6f9f58c4dd-tlnbl\" (UID: \"ea8ade20-0f06-4868-8df4-70d1c2fb40ce\") " pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.360546 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-config\") pod \"controller-manager-6f9f58c4dd-tlnbl\" (UID: \"ea8ade20-0f06-4868-8df4-70d1c2fb40ce\") " pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.363469 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-serving-cert\") pod \"controller-manager-6f9f58c4dd-tlnbl\" (UID: \"ea8ade20-0f06-4868-8df4-70d1c2fb40ce\") " pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.367771 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6qvt4"] Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.372643 4687 generic.go:334] "Generic (PLEG): container finished" podID="04e92ab6-5602-4d97-9e70-f95ff9769a79" containerID="6815c81392e78fe2f1ed67021a45475bdcf55c9543991ada7fcff7ba4898c6c3" exitCode=0 Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.372697 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.372708 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw" event={"ID":"04e92ab6-5602-4d97-9e70-f95ff9769a79","Type":"ContainerDied","Data":"6815c81392e78fe2f1ed67021a45475bdcf55c9543991ada7fcff7ba4898c6c3"} Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.372762 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw" event={"ID":"04e92ab6-5602-4d97-9e70-f95ff9769a79","Type":"ContainerDied","Data":"7499533b994e92ae0b8da7ad394eb713d8de2b517c8f68351166a51b6cdd5038"} Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.372815 4687 scope.go:117] "RemoveContainer" containerID="6815c81392e78fe2f1ed67021a45475bdcf55c9543991ada7fcff7ba4898c6c3" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.387639 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p59kc\" (UniqueName: 
\"kubernetes.io/projected/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-kube-api-access-p59kc\") pod \"controller-manager-6f9f58c4dd-tlnbl\" (UID: \"ea8ade20-0f06-4868-8df4-70d1c2fb40ce\") " pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.412465 4687 generic.go:334] "Generic (PLEG): container finished" podID="e899d87a-f034-4436-8409-ca04178918b7" containerID="a60b310cdec6733ac239fb8523ff17cee642452cef4858be568fedb8075066e1" exitCode=0 Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.412630 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537820-r5c29" event={"ID":"e899d87a-f034-4436-8409-ca04178918b7","Type":"ContainerDied","Data":"a60b310cdec6733ac239fb8523ff17cee642452cef4858be568fedb8075066e1"} Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.416079 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7h597" event={"ID":"8c936fa0-c15d-4c15-b85b-e2e2f1f8fec3","Type":"ContainerStarted","Data":"b1c6b0b09708f159aeacd6da50db9cf3b128d16769f3164c1fce192ba0c35da9"} Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.421390 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.445169 4687 generic.go:334] "Generic (PLEG): container finished" podID="eaa6a825-72b4-4544-9e19-5af6b2c7648e" containerID="6e973c19e8826a1f009fac80fc4f07882a7ae803718832a7495e81b398a11e0b" exitCode=0 Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.445297 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n" event={"ID":"eaa6a825-72b4-4544-9e19-5af6b2c7648e","Type":"ContainerDied","Data":"6e973c19e8826a1f009fac80fc4f07882a7ae803718832a7495e81b398a11e0b"} Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 
09:05:14.445335 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n" event={"ID":"eaa6a825-72b4-4544-9e19-5af6b2c7648e","Type":"ContainerDied","Data":"eec4d7ce43bbec53e53a04ee55088542da07736ba4409ac570aeeb07e55d9886"} Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.445992 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tx86n" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.457733 4687 generic.go:334] "Generic (PLEG): container finished" podID="9a9c467e-d2ff-4322-bc25-5cfe38dff784" containerID="8a1dfa5e933b1a8fb416fa8f82b3279a6d61c79acc1404a02f0b963a11cd365f" exitCode=0 Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.458376 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svtsw" event={"ID":"9a9c467e-d2ff-4322-bc25-5cfe38dff784","Type":"ContainerDied","Data":"8a1dfa5e933b1a8fb416fa8f82b3279a6d61c79acc1404a02f0b963a11cd365f"} Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.460154 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55vnd\" (UniqueName: \"kubernetes.io/projected/998b35fc-9704-4608-94c8-eccb4ca28857-kube-api-access-55vnd\") pod \"redhat-operators-6qvt4\" (UID: \"998b35fc-9704-4608-94c8-eccb4ca28857\") " pod="openshift-marketplace/redhat-operators-6qvt4" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.460339 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:14 crc kubenswrapper[4687]: 
I0228 09:05:14.460391 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/998b35fc-9704-4608-94c8-eccb4ca28857-catalog-content\") pod \"redhat-operators-6qvt4\" (UID: \"998b35fc-9704-4608-94c8-eccb4ca28857\") " pod="openshift-marketplace/redhat-operators-6qvt4" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.460540 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/998b35fc-9704-4608-94c8-eccb4ca28857-utilities\") pod \"redhat-operators-6qvt4\" (UID: \"998b35fc-9704-4608-94c8-eccb4ca28857\") " pod="openshift-marketplace/redhat-operators-6qvt4" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.460909 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svtsw" event={"ID":"9a9c467e-d2ff-4322-bc25-5cfe38dff784","Type":"ContainerStarted","Data":"afb5a525dca570e091fd8a08e97d3c21592dddb59f94b5917f1d7ccb0a3e32da"} Feb 28 09:05:14 crc kubenswrapper[4687]: E0228 09:05:14.460918 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:14.960895595 +0000 UTC m=+106.651464932 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.462036 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw"] Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.468114 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4f5xw"] Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.484755 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7h597" podStartSLOduration=77.484737853 podStartE2EDuration="1m17.484737853s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:14.483218865 +0000 UTC m=+106.173788201" watchObservedRunningTime="2026-02-28 09:05:14.484737853 +0000 UTC m=+106.175307190" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.484934 4687 scope.go:117] "RemoveContainer" containerID="6815c81392e78fe2f1ed67021a45475bdcf55c9543991ada7fcff7ba4898c6c3" Feb 28 09:05:14 crc kubenswrapper[4687]: E0228 09:05:14.485541 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6815c81392e78fe2f1ed67021a45475bdcf55c9543991ada7fcff7ba4898c6c3\": container with ID starting with 6815c81392e78fe2f1ed67021a45475bdcf55c9543991ada7fcff7ba4898c6c3 not found: ID does not 
exist" containerID="6815c81392e78fe2f1ed67021a45475bdcf55c9543991ada7fcff7ba4898c6c3" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.485573 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6815c81392e78fe2f1ed67021a45475bdcf55c9543991ada7fcff7ba4898c6c3"} err="failed to get container status \"6815c81392e78fe2f1ed67021a45475bdcf55c9543991ada7fcff7ba4898c6c3\": rpc error: code = NotFound desc = could not find container \"6815c81392e78fe2f1ed67021a45475bdcf55c9543991ada7fcff7ba4898c6c3\": container with ID starting with 6815c81392e78fe2f1ed67021a45475bdcf55c9543991ada7fcff7ba4898c6c3 not found: ID does not exist" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.485635 4687 scope.go:117] "RemoveContainer" containerID="6e973c19e8826a1f009fac80fc4f07882a7ae803718832a7495e81b398a11e0b" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.504906 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" event={"ID":"9c633c27-c00d-4436-8b95-c327bcf08a0c","Type":"ContainerStarted","Data":"ff297900730e8875f90554eb06e9a67f1a236289a6bed88a2feaf33219d83bc4"} Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.515297 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tx86n"] Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.520259 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tx86n"] Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.546568 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-26znf" event={"ID":"664a84bd-b59d-4f25-824f-12b593193cd2","Type":"ContainerStarted","Data":"9ac444955886e3b050435ee401c4357ae5b31d23463732b6732b03ef3bbcdc78"} Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.548866 4687 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-8vhfl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.548925 4687 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8vhfl" podUID="4aa07587-0d38-4e29-92ef-c6957b5526a8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.550403 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.551634 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" podStartSLOduration=77.551600051 podStartE2EDuration="1m17.551600051s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:14.549868412 +0000 UTC m=+106.240437749" watchObservedRunningTime="2026-02-28 09:05:14.551600051 +0000 UTC m=+106.242169388" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.555064 4687 scope.go:117] "RemoveContainer" containerID="6e973c19e8826a1f009fac80fc4f07882a7ae803718832a7495e81b398a11e0b" Feb 28 09:05:14 crc kubenswrapper[4687]: E0228 09:05:14.556081 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e973c19e8826a1f009fac80fc4f07882a7ae803718832a7495e81b398a11e0b\": container with ID starting with 6e973c19e8826a1f009fac80fc4f07882a7ae803718832a7495e81b398a11e0b not found: ID does not exist" 
containerID="6e973c19e8826a1f009fac80fc4f07882a7ae803718832a7495e81b398a11e0b" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.556139 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e973c19e8826a1f009fac80fc4f07882a7ae803718832a7495e81b398a11e0b"} err="failed to get container status \"6e973c19e8826a1f009fac80fc4f07882a7ae803718832a7495e81b398a11e0b\": rpc error: code = NotFound desc = could not find container \"6e973c19e8826a1f009fac80fc4f07882a7ae803718832a7495e81b398a11e0b\": container with ID starting with 6e973c19e8826a1f009fac80fc4f07882a7ae803718832a7495e81b398a11e0b not found: ID does not exist" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.558471 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-5xt25" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.560944 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.561339 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/998b35fc-9704-4608-94c8-eccb4ca28857-utilities\") pod \"redhat-operators-6qvt4\" (UID: \"998b35fc-9704-4608-94c8-eccb4ca28857\") " pod="openshift-marketplace/redhat-operators-6qvt4" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.561380 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55vnd\" (UniqueName: \"kubernetes.io/projected/998b35fc-9704-4608-94c8-eccb4ca28857-kube-api-access-55vnd\") pod \"redhat-operators-6qvt4\" (UID: \"998b35fc-9704-4608-94c8-eccb4ca28857\") 
" pod="openshift-marketplace/redhat-operators-6qvt4" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.561462 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/998b35fc-9704-4608-94c8-eccb4ca28857-catalog-content\") pod \"redhat-operators-6qvt4\" (UID: \"998b35fc-9704-4608-94c8-eccb4ca28857\") " pod="openshift-marketplace/redhat-operators-6qvt4" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.562313 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/998b35fc-9704-4608-94c8-eccb4ca28857-utilities\") pod \"redhat-operators-6qvt4\" (UID: \"998b35fc-9704-4608-94c8-eccb4ca28857\") " pod="openshift-marketplace/redhat-operators-6qvt4" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.562430 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/998b35fc-9704-4608-94c8-eccb4ca28857-catalog-content\") pod \"redhat-operators-6qvt4\" (UID: \"998b35fc-9704-4608-94c8-eccb4ca28857\") " pod="openshift-marketplace/redhat-operators-6qvt4" Feb 28 09:05:14 crc kubenswrapper[4687]: E0228 09:05:14.562867 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:15.062853704 +0000 UTC m=+106.753423041 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.611769 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55vnd\" (UniqueName: \"kubernetes.io/projected/998b35fc-9704-4608-94c8-eccb4ca28857-kube-api-access-55vnd\") pod \"redhat-operators-6qvt4\" (UID: \"998b35fc-9704-4608-94c8-eccb4ca28857\") " pod="openshift-marketplace/redhat-operators-6qvt4" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.643468 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.666449 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.668960 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.674929 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.675185 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 28 09:05:14 crc kubenswrapper[4687]: E0228 09:05:14.675683 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:15.175426064 +0000 UTC m=+106.865995402 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.693996 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04e92ab6-5602-4d97-9e70-f95ff9769a79" path="/var/lib/kubelet/pods/04e92ab6-5602-4d97-9e70-f95ff9769a79/volumes" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.694612 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaa6a825-72b4-4544-9e19-5af6b2c7648e" path="/var/lib/kubelet/pods/eaa6a825-72b4-4544-9e19-5af6b2c7648e/volumes" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.695038 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.702956 
4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6qvt4" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.752692 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-npwxl"] Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.769586 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.770111 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9412124b-67d9-4ef2-aba0-05c04c87ae2a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9412124b-67d9-4ef2-aba0-05c04c87ae2a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.770175 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9412124b-67d9-4ef2-aba0-05c04c87ae2a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9412124b-67d9-4ef2-aba0-05c04c87ae2a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 28 09:05:14 crc kubenswrapper[4687]: E0228 09:05:14.770233 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 09:05:15.270210578 +0000 UTC m=+106.960779915 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.770510 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:14 crc kubenswrapper[4687]: E0228 09:05:14.770920 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 09:05:15.270900406 +0000 UTC m=+106.961469742 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-n9h5k" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.773633 4687 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-28T09:05:14.282848887Z","Handler":null,"Name":""} Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.777137 4687 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.777310 4687 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.810062 4687 patch_prober.go:28] interesting pod/router-default-5444994796-zrtwj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 09:05:14 crc kubenswrapper[4687]: [-]has-synced failed: reason withheld Feb 28 09:05:14 crc kubenswrapper[4687]: [+]process-running ok Feb 28 09:05:14 crc kubenswrapper[4687]: healthz check failed Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.810113 4687 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zrtwj" podUID="55b1fe7b-e164-4f79-835b-0cc128a680eb" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 09:05:14 crc kubenswrapper[4687]: W0228 09:05:14.856086 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69eb70ff_d8c7_4dba_9f8e_1969b7947640.slice/crio-b5eefc022d09438f17d293a881385620a898ee994dc4031e3ed0c27460bf8e06 WatchSource:0}: Error finding container b5eefc022d09438f17d293a881385620a898ee994dc4031e3ed0c27460bf8e06: Status 404 returned error can't find the container with id b5eefc022d09438f17d293a881385620a898ee994dc4031e3ed0c27460bf8e06 Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.880709 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.881116 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9412124b-67d9-4ef2-aba0-05c04c87ae2a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9412124b-67d9-4ef2-aba0-05c04c87ae2a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.881152 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9412124b-67d9-4ef2-aba0-05c04c87ae2a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9412124b-67d9-4ef2-aba0-05c04c87ae2a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.881239 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/9412124b-67d9-4ef2-aba0-05c04c87ae2a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9412124b-67d9-4ef2-aba0-05c04c87ae2a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.891954 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.899970 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl"] Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.905738 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9412124b-67d9-4ef2-aba0-05c04c87ae2a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9412124b-67d9-4ef2-aba0-05c04c87ae2a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.981980 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.988133 4687 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 28 09:05:14 crc kubenswrapper[4687]: I0228 09:05:14.992088 4687 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.013526 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.015933 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-n9h5k\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.036053 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6qvt4"] Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.082380 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.360150 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.534636 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-n9h5k"] Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.588400 4687 generic.go:334] "Generic (PLEG): container finished" podID="512bb25a-8693-4a78-afcc-77e005a73c0f" containerID="381ea14747fff6993b958b645c052d441b57f784bb90a7b1a5439f60c3762659" exitCode=0 Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.588489 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8gpq" event={"ID":"512bb25a-8693-4a78-afcc-77e005a73c0f","Type":"ContainerDied","Data":"381ea14747fff6993b958b645c052d441b57f784bb90a7b1a5439f60c3762659"} Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.588624 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8gpq" event={"ID":"512bb25a-8693-4a78-afcc-77e005a73c0f","Type":"ContainerStarted","Data":"f0a131154c270182a4e882ee3b2affd3cdcfd109e6cbe48b209129dc2d3c0def"} Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.606940 4687 generic.go:334] "Generic (PLEG): container finished" podID="998b35fc-9704-4608-94c8-eccb4ca28857" containerID="29bf05d54b6fa3035a3f110a57950728b32d06e6c1b48be2a6566424b783dbd1" exitCode=0 Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.607048 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qvt4" event={"ID":"998b35fc-9704-4608-94c8-eccb4ca28857","Type":"ContainerDied","Data":"29bf05d54b6fa3035a3f110a57950728b32d06e6c1b48be2a6566424b783dbd1"} Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.607083 4687 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qvt4" event={"ID":"998b35fc-9704-4608-94c8-eccb4ca28857","Type":"ContainerStarted","Data":"011a513c919b65446657013d6079e813d248f8214a2adb3500124671c4131c96"} Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.614243 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" event={"ID":"ea8ade20-0f06-4868-8df4-70d1c2fb40ce","Type":"ContainerStarted","Data":"6f6efd338801af6c83f7194389cb16beca1b98555599f3131211c4316e03c372"} Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.614289 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" event={"ID":"ea8ade20-0f06-4868-8df4-70d1c2fb40ce","Type":"ContainerStarted","Data":"d3076cd4f3199841ac30c56c8ddfc1c629e748c0e914304904a8a5839dcdea56"} Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.614688 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.623811 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.624731 4687 generic.go:334] "Generic (PLEG): container finished" podID="69eb70ff-d8c7-4dba-9f8e-1969b7947640" containerID="6c06d165e515c771f69082cac703d59cd8f8f4a6f9338df5e86de9318445ca3e" exitCode=0 Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.625260 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npwxl" event={"ID":"69eb70ff-d8c7-4dba-9f8e-1969b7947640","Type":"ContainerDied","Data":"6c06d165e515c771f69082cac703d59cd8f8f4a6f9338df5e86de9318445ca3e"} Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.625310 4687 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npwxl" event={"ID":"69eb70ff-d8c7-4dba-9f8e-1969b7947640","Type":"ContainerStarted","Data":"b5eefc022d09438f17d293a881385620a898ee994dc4031e3ed0c27460bf8e06"} Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.651199 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9412124b-67d9-4ef2-aba0-05c04c87ae2a","Type":"ContainerStarted","Data":"11b4a42511b78ad3326d623228cf5610a8f6d10c7afc4a9c0e70da4b37ce27b8"} Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.657116 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-26znf" event={"ID":"664a84bd-b59d-4f25-824f-12b593193cd2","Type":"ContainerStarted","Data":"500a6fad65b6de596c4a8f38a9e43464bce4323d7e3c4ee9c1aeb90be3dc4a54"} Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.657216 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-26znf" event={"ID":"664a84bd-b59d-4f25-824f-12b593193cd2","Type":"ContainerStarted","Data":"7897291a8070d52fc6edb90e2897fb81f49e38f29067b95f419826a952e53099"} Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.659861 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" podStartSLOduration=3.659848905 podStartE2EDuration="3.659848905s" podCreationTimestamp="2026-02-28 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:15.642181113 +0000 UTC m=+107.332750450" watchObservedRunningTime="2026-02-28 09:05:15.659848905 +0000 UTC m=+107.350418252" Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.661420 4687 generic.go:334] "Generic (PLEG): container finished" podID="308fcc37-8d88-4077-9f16-e82ab7f6d067" 
containerID="0840960111b4022dd98b90df070e160cd1ae8cd3fe1364905bdf2f5419d136f3" exitCode=0 Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.661769 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"308fcc37-8d88-4077-9f16-e82ab7f6d067","Type":"ContainerDied","Data":"0840960111b4022dd98b90df070e160cd1ae8cd3fe1364905bdf2f5419d136f3"} Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.661833 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"308fcc37-8d88-4077-9f16-e82ab7f6d067","Type":"ContainerStarted","Data":"838a8c82a182656eb3e5d3ddd07cea845ccd81e718c9b3d92614249677dfd952"} Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.673693 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-w4s2q" Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.691629 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-26znf" podStartSLOduration=10.69161464 podStartE2EDuration="10.69161464s" podCreationTimestamp="2026-02-28 09:05:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:15.691471872 +0000 UTC m=+107.382041229" watchObservedRunningTime="2026-02-28 09:05:15.69161464 +0000 UTC m=+107.382183977" Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.807899 4687 patch_prober.go:28] interesting pod/router-default-5444994796-zrtwj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 09:05:15 crc kubenswrapper[4687]: [-]has-synced failed: reason withheld Feb 28 09:05:15 crc kubenswrapper[4687]: [+]process-running ok Feb 28 
09:05:15 crc kubenswrapper[4687]: healthz check failed Feb 28 09:05:15 crc kubenswrapper[4687]: I0228 09:05:15.807952 4687 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zrtwj" podUID="55b1fe7b-e164-4f79-835b-0cc128a680eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.033894 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537820-r5c29" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.130614 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e899d87a-f034-4436-8409-ca04178918b7-config-volume\") pod \"e899d87a-f034-4436-8409-ca04178918b7\" (UID: \"e899d87a-f034-4436-8409-ca04178918b7\") " Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.131256 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e899d87a-f034-4436-8409-ca04178918b7-config-volume" (OuterVolumeSpecName: "config-volume") pod "e899d87a-f034-4436-8409-ca04178918b7" (UID: "e899d87a-f034-4436-8409-ca04178918b7"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.131326 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29j69\" (UniqueName: \"kubernetes.io/projected/e899d87a-f034-4436-8409-ca04178918b7-kube-api-access-29j69\") pod \"e899d87a-f034-4436-8409-ca04178918b7\" (UID: \"e899d87a-f034-4436-8409-ca04178918b7\") " Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.131361 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e899d87a-f034-4436-8409-ca04178918b7-secret-volume\") pod \"e899d87a-f034-4436-8409-ca04178918b7\" (UID: \"e899d87a-f034-4436-8409-ca04178918b7\") " Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.132231 4687 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e899d87a-f034-4436-8409-ca04178918b7-config-volume\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.137286 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e899d87a-f034-4436-8409-ca04178918b7-kube-api-access-29j69" (OuterVolumeSpecName: "kube-api-access-29j69") pod "e899d87a-f034-4436-8409-ca04178918b7" (UID: "e899d87a-f034-4436-8409-ca04178918b7"). InnerVolumeSpecName "kube-api-access-29j69". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.138908 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e899d87a-f034-4436-8409-ca04178918b7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e899d87a-f034-4436-8409-ca04178918b7" (UID: "e899d87a-f034-4436-8409-ca04178918b7"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.233918 4687 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e899d87a-f034-4436-8409-ca04178918b7-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.233953 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29j69\" (UniqueName: \"kubernetes.io/projected/e899d87a-f034-4436-8409-ca04178918b7-kube-api-access-29j69\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.314348 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz"] Feb 28 09:05:16 crc kubenswrapper[4687]: E0228 09:05:16.314684 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e899d87a-f034-4436-8409-ca04178918b7" containerName="collect-profiles" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.314698 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="e899d87a-f034-4436-8409-ca04178918b7" containerName="collect-profiles" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.314827 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="e899d87a-f034-4436-8409-ca04178918b7" containerName="collect-profiles" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.315182 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.317378 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.317626 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.317843 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.317848 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.318286 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.319828 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.325419 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz"] Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.335313 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1f6711c-8518-46d6-a6ab-fae4a3e26f6d-client-ca\") pod \"route-controller-manager-799dfd4db7-szmzz\" (UID: \"b1f6711c-8518-46d6-a6ab-fae4a3e26f6d\") " pod="openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.335387 4687 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f6711c-8518-46d6-a6ab-fae4a3e26f6d-config\") pod \"route-controller-manager-799dfd4db7-szmzz\" (UID: \"b1f6711c-8518-46d6-a6ab-fae4a3e26f6d\") " pod="openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.336944 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1f6711c-8518-46d6-a6ab-fae4a3e26f6d-serving-cert\") pod \"route-controller-manager-799dfd4db7-szmzz\" (UID: \"b1f6711c-8518-46d6-a6ab-fae4a3e26f6d\") " pod="openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.336993 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpxbf\" (UniqueName: \"kubernetes.io/projected/b1f6711c-8518-46d6-a6ab-fae4a3e26f6d-kube-api-access-mpxbf\") pod \"route-controller-manager-799dfd4db7-szmzz\" (UID: \"b1f6711c-8518-46d6-a6ab-fae4a3e26f6d\") " pod="openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.438070 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1f6711c-8518-46d6-a6ab-fae4a3e26f6d-client-ca\") pod \"route-controller-manager-799dfd4db7-szmzz\" (UID: \"b1f6711c-8518-46d6-a6ab-fae4a3e26f6d\") " pod="openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.438168 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f6711c-8518-46d6-a6ab-fae4a3e26f6d-config\") pod \"route-controller-manager-799dfd4db7-szmzz\" 
(UID: \"b1f6711c-8518-46d6-a6ab-fae4a3e26f6d\") " pod="openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.438204 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1f6711c-8518-46d6-a6ab-fae4a3e26f6d-serving-cert\") pod \"route-controller-manager-799dfd4db7-szmzz\" (UID: \"b1f6711c-8518-46d6-a6ab-fae4a3e26f6d\") " pod="openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.438268 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpxbf\" (UniqueName: \"kubernetes.io/projected/b1f6711c-8518-46d6-a6ab-fae4a3e26f6d-kube-api-access-mpxbf\") pod \"route-controller-manager-799dfd4db7-szmzz\" (UID: \"b1f6711c-8518-46d6-a6ab-fae4a3e26f6d\") " pod="openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.439551 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1f6711c-8518-46d6-a6ab-fae4a3e26f6d-client-ca\") pod \"route-controller-manager-799dfd4db7-szmzz\" (UID: \"b1f6711c-8518-46d6-a6ab-fae4a3e26f6d\") " pod="openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.440433 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f6711c-8518-46d6-a6ab-fae4a3e26f6d-config\") pod \"route-controller-manager-799dfd4db7-szmzz\" (UID: \"b1f6711c-8518-46d6-a6ab-fae4a3e26f6d\") " pod="openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.443985 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1f6711c-8518-46d6-a6ab-fae4a3e26f6d-serving-cert\") pod \"route-controller-manager-799dfd4db7-szmzz\" (UID: \"b1f6711c-8518-46d6-a6ab-fae4a3e26f6d\") " pod="openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.452962 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpxbf\" (UniqueName: \"kubernetes.io/projected/b1f6711c-8518-46d6-a6ab-fae4a3e26f6d-kube-api-access-mpxbf\") pod \"route-controller-manager-799dfd4db7-szmzz\" (UID: \"b1f6711c-8518-46d6-a6ab-fae4a3e26f6d\") " pod="openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.635112 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.676688 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.677646 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9412124b-67d9-4ef2-aba0-05c04c87ae2a","Type":"ContainerStarted","Data":"e5583e4641aba054c4b2d43f8f9754dbdfe5ea5617fa8043c81470d7af900874"} Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.702622 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.702589607 podStartE2EDuration="2.702589607s" podCreationTimestamp="2026-02-28 09:05:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:16.694651091 +0000 
UTC m=+108.385220439" watchObservedRunningTime="2026-02-28 09:05:16.702589607 +0000 UTC m=+108.393158945" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.712176 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537820-r5c29" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.712314 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537820-r5c29" event={"ID":"e899d87a-f034-4436-8409-ca04178918b7","Type":"ContainerDied","Data":"b1e133cb349603baef2789db9f222243fb3ab6750fdf585b440c5145746afc3c"} Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.712364 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1e133cb349603baef2789db9f222243fb3ab6750fdf585b440c5145746afc3c" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.717570 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" event={"ID":"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa","Type":"ContainerStarted","Data":"205ec602ca723acc2161f711d005827e53888abe184bc61b1728f6a7baa76c47"} Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.717596 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" event={"ID":"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa","Type":"ContainerStarted","Data":"4809719ca3f67beaecefed918c1d665433039841f891a1596f02691ec35ece92"} Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.718790 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.743558 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" podStartSLOduration=79.743540054 
podStartE2EDuration="1m19.743540054s" podCreationTimestamp="2026-02-28 09:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:16.735579206 +0000 UTC m=+108.426148543" watchObservedRunningTime="2026-02-28 09:05:16.743540054 +0000 UTC m=+108.434109391" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.808807 4687 patch_prober.go:28] interesting pod/router-default-5444994796-zrtwj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 09:05:16 crc kubenswrapper[4687]: [-]has-synced failed: reason withheld Feb 28 09:05:16 crc kubenswrapper[4687]: [+]process-running ok Feb 28 09:05:16 crc kubenswrapper[4687]: healthz check failed Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.808872 4687 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zrtwj" podUID="55b1fe7b-e164-4f79-835b-0cc128a680eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 09:05:16 crc kubenswrapper[4687]: I0228 09:05:16.985773 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 28 09:05:17 crc kubenswrapper[4687]: I0228 09:05:17.047112 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/308fcc37-8d88-4077-9f16-e82ab7f6d067-kube-api-access\") pod \"308fcc37-8d88-4077-9f16-e82ab7f6d067\" (UID: \"308fcc37-8d88-4077-9f16-e82ab7f6d067\") " Feb 28 09:05:17 crc kubenswrapper[4687]: I0228 09:05:17.047180 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/308fcc37-8d88-4077-9f16-e82ab7f6d067-kubelet-dir\") pod \"308fcc37-8d88-4077-9f16-e82ab7f6d067\" (UID: \"308fcc37-8d88-4077-9f16-e82ab7f6d067\") " Feb 28 09:05:17 crc kubenswrapper[4687]: I0228 09:05:17.047272 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/308fcc37-8d88-4077-9f16-e82ab7f6d067-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "308fcc37-8d88-4077-9f16-e82ab7f6d067" (UID: "308fcc37-8d88-4077-9f16-e82ab7f6d067"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:05:17 crc kubenswrapper[4687]: I0228 09:05:17.047486 4687 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/308fcc37-8d88-4077-9f16-e82ab7f6d067-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:17 crc kubenswrapper[4687]: I0228 09:05:17.054268 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308fcc37-8d88-4077-9f16-e82ab7f6d067-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "308fcc37-8d88-4077-9f16-e82ab7f6d067" (UID: "308fcc37-8d88-4077-9f16-e82ab7f6d067"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:05:17 crc kubenswrapper[4687]: I0228 09:05:17.090802 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz"] Feb 28 09:05:17 crc kubenswrapper[4687]: W0228 09:05:17.137562 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1f6711c_8518_46d6_a6ab_fae4a3e26f6d.slice/crio-5829b2073e4fd8c473df9a8b2efbff496f9e4ac0dba4b81bf06f58fcd112686b WatchSource:0}: Error finding container 5829b2073e4fd8c473df9a8b2efbff496f9e4ac0dba4b81bf06f58fcd112686b: Status 404 returned error can't find the container with id 5829b2073e4fd8c473df9a8b2efbff496f9e4ac0dba4b81bf06f58fcd112686b Feb 28 09:05:17 crc kubenswrapper[4687]: I0228 09:05:17.148363 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/308fcc37-8d88-4077-9f16-e82ab7f6d067-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:17 crc kubenswrapper[4687]: I0228 09:05:17.727189 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"308fcc37-8d88-4077-9f16-e82ab7f6d067","Type":"ContainerDied","Data":"838a8c82a182656eb3e5d3ddd07cea845ccd81e718c9b3d92614249677dfd952"} Feb 28 09:05:17 crc kubenswrapper[4687]: I0228 09:05:17.727239 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="838a8c82a182656eb3e5d3ddd07cea845ccd81e718c9b3d92614249677dfd952" Feb 28 09:05:17 crc kubenswrapper[4687]: I0228 09:05:17.727199 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 28 09:05:17 crc kubenswrapper[4687]: I0228 09:05:17.729439 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz" event={"ID":"b1f6711c-8518-46d6-a6ab-fae4a3e26f6d","Type":"ContainerStarted","Data":"34a89462f946cb6fa3e17cad723603922f3ba59834981d08c61c27b241b40c57"} Feb 28 09:05:17 crc kubenswrapper[4687]: I0228 09:05:17.729481 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz" event={"ID":"b1f6711c-8518-46d6-a6ab-fae4a3e26f6d","Type":"ContainerStarted","Data":"5829b2073e4fd8c473df9a8b2efbff496f9e4ac0dba4b81bf06f58fcd112686b"} Feb 28 09:05:17 crc kubenswrapper[4687]: I0228 09:05:17.731004 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz" Feb 28 09:05:17 crc kubenswrapper[4687]: I0228 09:05:17.732773 4687 generic.go:334] "Generic (PLEG): container finished" podID="9412124b-67d9-4ef2-aba0-05c04c87ae2a" containerID="e5583e4641aba054c4b2d43f8f9754dbdfe5ea5617fa8043c81470d7af900874" exitCode=0 Feb 28 09:05:17 crc kubenswrapper[4687]: I0228 09:05:17.733183 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9412124b-67d9-4ef2-aba0-05c04c87ae2a","Type":"ContainerDied","Data":"e5583e4641aba054c4b2d43f8f9754dbdfe5ea5617fa8043c81470d7af900874"} Feb 28 09:05:17 crc kubenswrapper[4687]: I0228 09:05:17.748851 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz" podStartSLOduration=5.748833502 podStartE2EDuration="5.748833502s" podCreationTimestamp="2026-02-28 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:17.746173828 +0000 UTC m=+109.436743165" watchObservedRunningTime="2026-02-28 09:05:17.748833502 +0000 UTC m=+109.439402838" Feb 28 09:05:17 crc kubenswrapper[4687]: I0228 09:05:17.766910 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz" Feb 28 09:05:17 crc kubenswrapper[4687]: E0228 09:05:17.801858 4687 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e6b6d91b683c4824d9fc5ef34d2ab0bd79f749327d9140168cea7b673cb637a6" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 28 09:05:17 crc kubenswrapper[4687]: E0228 09:05:17.803666 4687 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e6b6d91b683c4824d9fc5ef34d2ab0bd79f749327d9140168cea7b673cb637a6" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 28 09:05:17 crc kubenswrapper[4687]: E0228 09:05:17.806408 4687 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e6b6d91b683c4824d9fc5ef34d2ab0bd79f749327d9140168cea7b673cb637a6" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 28 09:05:17 crc kubenswrapper[4687]: E0228 09:05:17.806454 4687 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd" podUID="3b45242a-b238-4814-b6fa-f22a62c5907f" 
containerName="kube-multus-additional-cni-plugins" Feb 28 09:05:17 crc kubenswrapper[4687]: I0228 09:05:17.807838 4687 patch_prober.go:28] interesting pod/router-default-5444994796-zrtwj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 09:05:17 crc kubenswrapper[4687]: [-]has-synced failed: reason withheld Feb 28 09:05:17 crc kubenswrapper[4687]: [+]process-running ok Feb 28 09:05:17 crc kubenswrapper[4687]: healthz check failed Feb 28 09:05:17 crc kubenswrapper[4687]: I0228 09:05:17.807871 4687 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zrtwj" podUID="55b1fe7b-e164-4f79-835b-0cc128a680eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 09:05:18 crc kubenswrapper[4687]: I0228 09:05:18.371284 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:18 crc kubenswrapper[4687]: I0228 09:05:18.375290 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-bqdqx" Feb 28 09:05:18 crc kubenswrapper[4687]: I0228 09:05:18.406613 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-4m8kh" Feb 28 09:05:18 crc kubenswrapper[4687]: I0228 09:05:18.406758 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-4m8kh" Feb 28 09:05:18 crc kubenswrapper[4687]: I0228 09:05:18.423929 4687 patch_prober.go:28] interesting pod/console-f9d7485db-4m8kh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Feb 28 09:05:18 crc kubenswrapper[4687]: I0228 
09:05:18.424007 4687 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-4m8kh" podUID="96e679f2-11c5-4ade-abc4-56a7b85a5668" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Feb 28 09:05:18 crc kubenswrapper[4687]: I0228 09:05:18.683210 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 28 09:05:18 crc kubenswrapper[4687]: I0228 09:05:18.710711 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:18 crc kubenswrapper[4687]: I0228 09:05:18.710748 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:18 crc kubenswrapper[4687]: I0228 09:05:18.717599 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:18 crc kubenswrapper[4687]: I0228 09:05:18.745690 4687 patch_prober.go:28] interesting pod/downloads-7954f5f757-8vhfl container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 28 09:05:18 crc kubenswrapper[4687]: I0228 09:05:18.745764 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8vhfl" podUID="4aa07587-0d38-4e29-92ef-c6957b5526a8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 28 09:05:18 crc kubenswrapper[4687]: I0228 09:05:18.745691 4687 patch_prober.go:28] interesting pod/downloads-7954f5f757-8vhfl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 28 09:05:18 crc kubenswrapper[4687]: I0228 09:05:18.745873 4687 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8vhfl" podUID="4aa07587-0d38-4e29-92ef-c6957b5526a8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 28 09:05:18 crc kubenswrapper[4687]: I0228 09:05:18.766500 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-s76rx" Feb 28 09:05:18 crc kubenswrapper[4687]: I0228 09:05:18.787704 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=0.787684096 podStartE2EDuration="787.684096ms" podCreationTimestamp="2026-02-28 09:05:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:18.769734975 +0000 UTC m=+110.460304322" watchObservedRunningTime="2026-02-28 09:05:18.787684096 +0000 UTC m=+110.478253433" Feb 28 09:05:18 crc kubenswrapper[4687]: I0228 09:05:18.808257 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-zrtwj" Feb 28 09:05:18 crc kubenswrapper[4687]: I0228 09:05:18.812862 4687 patch_prober.go:28] interesting pod/router-default-5444994796-zrtwj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 09:05:18 crc kubenswrapper[4687]: [-]has-synced failed: reason withheld Feb 28 09:05:18 crc kubenswrapper[4687]: [+]process-running ok Feb 28 09:05:18 crc kubenswrapper[4687]: healthz check failed Feb 28 09:05:18 crc kubenswrapper[4687]: I0228 09:05:18.812920 4687 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zrtwj" podUID="55b1fe7b-e164-4f79-835b-0cc128a680eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 09:05:19 crc kubenswrapper[4687]: I0228 09:05:19.059648 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 28 09:05:19 crc kubenswrapper[4687]: I0228 09:05:19.199039 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9412124b-67d9-4ef2-aba0-05c04c87ae2a-kube-api-access\") pod \"9412124b-67d9-4ef2-aba0-05c04c87ae2a\" (UID: \"9412124b-67d9-4ef2-aba0-05c04c87ae2a\") " Feb 28 09:05:19 crc kubenswrapper[4687]: I0228 09:05:19.199199 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9412124b-67d9-4ef2-aba0-05c04c87ae2a-kubelet-dir\") pod \"9412124b-67d9-4ef2-aba0-05c04c87ae2a\" (UID: \"9412124b-67d9-4ef2-aba0-05c04c87ae2a\") " Feb 28 09:05:19 crc kubenswrapper[4687]: I0228 09:05:19.199320 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9412124b-67d9-4ef2-aba0-05c04c87ae2a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9412124b-67d9-4ef2-aba0-05c04c87ae2a" (UID: "9412124b-67d9-4ef2-aba0-05c04c87ae2a"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:05:19 crc kubenswrapper[4687]: I0228 09:05:19.199622 4687 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9412124b-67d9-4ef2-aba0-05c04c87ae2a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:19 crc kubenswrapper[4687]: I0228 09:05:19.224866 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9412124b-67d9-4ef2-aba0-05c04c87ae2a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9412124b-67d9-4ef2-aba0-05c04c87ae2a" (UID: "9412124b-67d9-4ef2-aba0-05c04c87ae2a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:05:19 crc kubenswrapper[4687]: I0228 09:05:19.299573 4687 ???:1] "http: TLS handshake error from 192.168.126.11:56426: no serving certificate available for the kubelet" Feb 28 09:05:19 crc kubenswrapper[4687]: I0228 09:05:19.300479 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9412124b-67d9-4ef2-aba0-05c04c87ae2a-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:19 crc kubenswrapper[4687]: I0228 09:05:19.764184 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9412124b-67d9-4ef2-aba0-05c04c87ae2a","Type":"ContainerDied","Data":"11b4a42511b78ad3326d623228cf5610a8f6d10c7afc4a9c0e70da4b37ce27b8"} Feb 28 09:05:19 crc kubenswrapper[4687]: I0228 09:05:19.764239 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11b4a42511b78ad3326d623228cf5610a8f6d10c7afc4a9c0e70da4b37ce27b8" Feb 28 09:05:19 crc kubenswrapper[4687]: I0228 09:05:19.764248 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 28 09:05:19 crc kubenswrapper[4687]: I0228 09:05:19.808649 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-zrtwj" Feb 28 09:05:19 crc kubenswrapper[4687]: I0228 09:05:19.810838 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-zrtwj" Feb 28 09:05:20 crc kubenswrapper[4687]: I0228 09:05:20.387384 4687 ???:1] "http: TLS handshake error from 192.168.126.11:56436: no serving certificate available for the kubelet" Feb 28 09:05:23 crc kubenswrapper[4687]: I0228 09:05:23.656820 4687 scope.go:117] "RemoveContainer" containerID="110dc193591d77cad10858a579d47ef5c71456399bf60b68f6b36dc40fc19406" Feb 28 09:05:23 crc kubenswrapper[4687]: I0228 09:05:23.706111 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-kvzpk" Feb 28 09:05:24 crc kubenswrapper[4687]: I0228 09:05:24.671295 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 28 09:05:24 crc kubenswrapper[4687]: I0228 09:05:24.803572 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 28 09:05:24 crc kubenswrapper[4687]: I0228 09:05:24.807043 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"822b6e44a39f66516e6e85496812470f40eecfb6bb6081cc4b35583fbb308dd9"} Feb 28 09:05:24 crc kubenswrapper[4687]: I0228 09:05:24.807328 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:05:24 crc kubenswrapper[4687]: I0228 09:05:24.825205 4687 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=0.825182595 podStartE2EDuration="825.182595ms" podCreationTimestamp="2026-02-28 09:05:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:24.823218911 +0000 UTC m=+116.513788248" watchObservedRunningTime="2026-02-28 09:05:24.825182595 +0000 UTC m=+116.515751932" Feb 28 09:05:24 crc kubenswrapper[4687]: I0228 09:05:24.853726 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=24.853703967 podStartE2EDuration="24.853703967s" podCreationTimestamp="2026-02-28 09:05:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:24.850105787 +0000 UTC m=+116.540675125" watchObservedRunningTime="2026-02-28 09:05:24.853703967 +0000 UTC m=+116.544273304" Feb 28 09:05:25 crc kubenswrapper[4687]: I0228 09:05:25.063762 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:05:27 crc kubenswrapper[4687]: E0228 09:05:27.752631 4687 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e6b6d91b683c4824d9fc5ef34d2ab0bd79f749327d9140168cea7b673cb637a6" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 28 09:05:27 crc kubenswrapper[4687]: E0228 09:05:27.758831 4687 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="e6b6d91b683c4824d9fc5ef34d2ab0bd79f749327d9140168cea7b673cb637a6" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 28 09:05:27 crc kubenswrapper[4687]: E0228 09:05:27.761132 4687 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e6b6d91b683c4824d9fc5ef34d2ab0bd79f749327d9140168cea7b673cb637a6" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 28 09:05:27 crc kubenswrapper[4687]: E0228 09:05:27.761182 4687 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd" podUID="3b45242a-b238-4814-b6fa-f22a62c5907f" containerName="kube-multus-additional-cni-plugins" Feb 28 09:05:28 crc kubenswrapper[4687]: I0228 09:05:28.412261 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-4m8kh" Feb 28 09:05:28 crc kubenswrapper[4687]: I0228 09:05:28.416499 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-4m8kh" Feb 28 09:05:28 crc kubenswrapper[4687]: I0228 09:05:28.759516 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-8vhfl" Feb 28 09:05:29 crc kubenswrapper[4687]: I0228 09:05:29.558853 4687 ???:1] "http: TLS handshake error from 192.168.126.11:56846: no serving certificate available for the kubelet" Feb 28 09:05:30 crc kubenswrapper[4687]: I0228 09:05:30.498596 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl"] Feb 28 09:05:30 crc kubenswrapper[4687]: I0228 09:05:30.499133 4687 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" podUID="ea8ade20-0f06-4868-8df4-70d1c2fb40ce" containerName="controller-manager" containerID="cri-o://6f6efd338801af6c83f7194389cb16beca1b98555599f3131211c4316e03c372" gracePeriod=30 Feb 28 09:05:30 crc kubenswrapper[4687]: I0228 09:05:30.500764 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz"] Feb 28 09:05:30 crc kubenswrapper[4687]: I0228 09:05:30.504992 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz" podUID="b1f6711c-8518-46d6-a6ab-fae4a3e26f6d" containerName="route-controller-manager" containerID="cri-o://34a89462f946cb6fa3e17cad723603922f3ba59834981d08c61c27b241b40c57" gracePeriod=30 Feb 28 09:05:30 crc kubenswrapper[4687]: I0228 09:05:30.861422 4687 generic.go:334] "Generic (PLEG): container finished" podID="ea8ade20-0f06-4868-8df4-70d1c2fb40ce" containerID="6f6efd338801af6c83f7194389cb16beca1b98555599f3131211c4316e03c372" exitCode=0 Feb 28 09:05:30 crc kubenswrapper[4687]: I0228 09:05:30.861511 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" event={"ID":"ea8ade20-0f06-4868-8df4-70d1c2fb40ce","Type":"ContainerDied","Data":"6f6efd338801af6c83f7194389cb16beca1b98555599f3131211c4316e03c372"} Feb 28 09:05:30 crc kubenswrapper[4687]: I0228 09:05:30.864852 4687 generic.go:334] "Generic (PLEG): container finished" podID="b1f6711c-8518-46d6-a6ab-fae4a3e26f6d" containerID="34a89462f946cb6fa3e17cad723603922f3ba59834981d08c61c27b241b40c57" exitCode=0 Feb 28 09:05:30 crc kubenswrapper[4687]: I0228 09:05:30.864911 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz" 
event={"ID":"b1f6711c-8518-46d6-a6ab-fae4a3e26f6d","Type":"ContainerDied","Data":"34a89462f946cb6fa3e17cad723603922f3ba59834981d08c61c27b241b40c57"} Feb 28 09:05:34 crc kubenswrapper[4687]: I0228 09:05:34.551636 4687 patch_prober.go:28] interesting pod/controller-manager-6f9f58c4dd-tlnbl container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body= Feb 28 09:05:34 crc kubenswrapper[4687]: I0228 09:05:34.552265 4687 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" podUID="ea8ade20-0f06-4868-8df4-70d1c2fb40ce" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" Feb 28 09:05:35 crc kubenswrapper[4687]: I0228 09:05:35.088944 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:05:35 crc kubenswrapper[4687]: I0228 09:05:35.720114 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.132538 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.136549 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.155911 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r"] Feb 28 09:05:37 crc kubenswrapper[4687]: E0228 09:05:37.156172 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea8ade20-0f06-4868-8df4-70d1c2fb40ce" containerName="controller-manager" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.156189 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea8ade20-0f06-4868-8df4-70d1c2fb40ce" containerName="controller-manager" Feb 28 09:05:37 crc kubenswrapper[4687]: E0228 09:05:37.156202 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1f6711c-8518-46d6-a6ab-fae4a3e26f6d" containerName="route-controller-manager" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.156208 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1f6711c-8518-46d6-a6ab-fae4a3e26f6d" containerName="route-controller-manager" Feb 28 09:05:37 crc kubenswrapper[4687]: E0228 09:05:37.156220 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9412124b-67d9-4ef2-aba0-05c04c87ae2a" containerName="pruner" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.156225 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="9412124b-67d9-4ef2-aba0-05c04c87ae2a" containerName="pruner" Feb 28 09:05:37 crc kubenswrapper[4687]: E0228 09:05:37.156234 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="308fcc37-8d88-4077-9f16-e82ab7f6d067" containerName="pruner" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.156239 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="308fcc37-8d88-4077-9f16-e82ab7f6d067" containerName="pruner" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.156332 4687 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9412124b-67d9-4ef2-aba0-05c04c87ae2a" containerName="pruner" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.156341 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea8ade20-0f06-4868-8df4-70d1c2fb40ce" containerName="controller-manager" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.156348 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="308fcc37-8d88-4077-9f16-e82ab7f6d067" containerName="pruner" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.156355 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1f6711c-8518-46d6-a6ab-fae4a3e26f6d" containerName="route-controller-manager" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.156641 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.182850 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r"] Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.274657 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f6711c-8518-46d6-a6ab-fae4a3e26f6d-config\") pod \"b1f6711c-8518-46d6-a6ab-fae4a3e26f6d\" (UID: \"b1f6711c-8518-46d6-a6ab-fae4a3e26f6d\") " Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.274897 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-client-ca\") pod \"ea8ade20-0f06-4868-8df4-70d1c2fb40ce\" (UID: \"ea8ade20-0f06-4868-8df4-70d1c2fb40ce\") " Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.274959 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-serving-cert\") pod \"ea8ade20-0f06-4868-8df4-70d1c2fb40ce\" (UID: \"ea8ade20-0f06-4868-8df4-70d1c2fb40ce\") " Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.274987 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-config\") pod \"ea8ade20-0f06-4868-8df4-70d1c2fb40ce\" (UID: \"ea8ade20-0f06-4868-8df4-70d1c2fb40ce\") " Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.275017 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1f6711c-8518-46d6-a6ab-fae4a3e26f6d-client-ca\") pod \"b1f6711c-8518-46d6-a6ab-fae4a3e26f6d\" (UID: \"b1f6711c-8518-46d6-a6ab-fae4a3e26f6d\") " Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.275060 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpxbf\" (UniqueName: \"kubernetes.io/projected/b1f6711c-8518-46d6-a6ab-fae4a3e26f6d-kube-api-access-mpxbf\") pod \"b1f6711c-8518-46d6-a6ab-fae4a3e26f6d\" (UID: \"b1f6711c-8518-46d6-a6ab-fae4a3e26f6d\") " Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.275092 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-proxy-ca-bundles\") pod \"ea8ade20-0f06-4868-8df4-70d1c2fb40ce\" (UID: \"ea8ade20-0f06-4868-8df4-70d1c2fb40ce\") " Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.275137 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1f6711c-8518-46d6-a6ab-fae4a3e26f6d-serving-cert\") pod \"b1f6711c-8518-46d6-a6ab-fae4a3e26f6d\" (UID: \"b1f6711c-8518-46d6-a6ab-fae4a3e26f6d\") " Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.275166 4687 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p59kc\" (UniqueName: \"kubernetes.io/projected/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-kube-api-access-p59kc\") pod \"ea8ade20-0f06-4868-8df4-70d1c2fb40ce\" (UID: \"ea8ade20-0f06-4868-8df4-70d1c2fb40ce\") " Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.275326 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adf401dc-8b1a-450f-ab64-0b8d881116d4-config\") pod \"route-controller-manager-8566c8f488-9992r\" (UID: \"adf401dc-8b1a-450f-ab64-0b8d881116d4\") " pod="openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.275355 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adf401dc-8b1a-450f-ab64-0b8d881116d4-serving-cert\") pod \"route-controller-manager-8566c8f488-9992r\" (UID: \"adf401dc-8b1a-450f-ab64-0b8d881116d4\") " pod="openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.275378 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8rpf\" (UniqueName: \"kubernetes.io/projected/adf401dc-8b1a-450f-ab64-0b8d881116d4-kube-api-access-k8rpf\") pod \"route-controller-manager-8566c8f488-9992r\" (UID: \"adf401dc-8b1a-450f-ab64-0b8d881116d4\") " pod="openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.275417 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/adf401dc-8b1a-450f-ab64-0b8d881116d4-client-ca\") pod \"route-controller-manager-8566c8f488-9992r\" (UID: 
\"adf401dc-8b1a-450f-ab64-0b8d881116d4\") " pod="openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.276457 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1f6711c-8518-46d6-a6ab-fae4a3e26f6d-config" (OuterVolumeSpecName: "config") pod "b1f6711c-8518-46d6-a6ab-fae4a3e26f6d" (UID: "b1f6711c-8518-46d6-a6ab-fae4a3e26f6d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.277074 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-config" (OuterVolumeSpecName: "config") pod "ea8ade20-0f06-4868-8df4-70d1c2fb40ce" (UID: "ea8ade20-0f06-4868-8df4-70d1c2fb40ce"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.277101 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1f6711c-8518-46d6-a6ab-fae4a3e26f6d-client-ca" (OuterVolumeSpecName: "client-ca") pod "b1f6711c-8518-46d6-a6ab-fae4a3e26f6d" (UID: "b1f6711c-8518-46d6-a6ab-fae4a3e26f6d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.277426 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-client-ca" (OuterVolumeSpecName: "client-ca") pod "ea8ade20-0f06-4868-8df4-70d1c2fb40ce" (UID: "ea8ade20-0f06-4868-8df4-70d1c2fb40ce"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.278190 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ea8ade20-0f06-4868-8df4-70d1c2fb40ce" (UID: "ea8ade20-0f06-4868-8df4-70d1c2fb40ce"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.281660 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ea8ade20-0f06-4868-8df4-70d1c2fb40ce" (UID: "ea8ade20-0f06-4868-8df4-70d1c2fb40ce"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.281818 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1f6711c-8518-46d6-a6ab-fae4a3e26f6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b1f6711c-8518-46d6-a6ab-fae4a3e26f6d" (UID: "b1f6711c-8518-46d6-a6ab-fae4a3e26f6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.283112 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-kube-api-access-p59kc" (OuterVolumeSpecName: "kube-api-access-p59kc") pod "ea8ade20-0f06-4868-8df4-70d1c2fb40ce" (UID: "ea8ade20-0f06-4868-8df4-70d1c2fb40ce"). InnerVolumeSpecName "kube-api-access-p59kc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.283840 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1f6711c-8518-46d6-a6ab-fae4a3e26f6d-kube-api-access-mpxbf" (OuterVolumeSpecName: "kube-api-access-mpxbf") pod "b1f6711c-8518-46d6-a6ab-fae4a3e26f6d" (UID: "b1f6711c-8518-46d6-a6ab-fae4a3e26f6d"). InnerVolumeSpecName "kube-api-access-mpxbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.376883 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/adf401dc-8b1a-450f-ab64-0b8d881116d4-client-ca\") pod \"route-controller-manager-8566c8f488-9992r\" (UID: \"adf401dc-8b1a-450f-ab64-0b8d881116d4\") " pod="openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.377090 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adf401dc-8b1a-450f-ab64-0b8d881116d4-config\") pod \"route-controller-manager-8566c8f488-9992r\" (UID: \"adf401dc-8b1a-450f-ab64-0b8d881116d4\") " pod="openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.377162 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adf401dc-8b1a-450f-ab64-0b8d881116d4-serving-cert\") pod \"route-controller-manager-8566c8f488-9992r\" (UID: \"adf401dc-8b1a-450f-ab64-0b8d881116d4\") " pod="openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.377194 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8rpf\" (UniqueName: 
\"kubernetes.io/projected/adf401dc-8b1a-450f-ab64-0b8d881116d4-kube-api-access-k8rpf\") pod \"route-controller-manager-8566c8f488-9992r\" (UID: \"adf401dc-8b1a-450f-ab64-0b8d881116d4\") " pod="openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.377305 4687 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1f6711c-8518-46d6-a6ab-fae4a3e26f6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.377317 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p59kc\" (UniqueName: \"kubernetes.io/projected/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-kube-api-access-p59kc\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.377328 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f6711c-8518-46d6-a6ab-fae4a3e26f6d-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.377337 4687 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-client-ca\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.377463 4687 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.377745 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.377761 4687 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b1f6711c-8518-46d6-a6ab-fae4a3e26f6d-client-ca\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.377770 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpxbf\" (UniqueName: \"kubernetes.io/projected/b1f6711c-8518-46d6-a6ab-fae4a3e26f6d-kube-api-access-mpxbf\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.377779 4687 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea8ade20-0f06-4868-8df4-70d1c2fb40ce-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.379139 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adf401dc-8b1a-450f-ab64-0b8d881116d4-config\") pod \"route-controller-manager-8566c8f488-9992r\" (UID: \"adf401dc-8b1a-450f-ab64-0b8d881116d4\") " pod="openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.384152 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/adf401dc-8b1a-450f-ab64-0b8d881116d4-client-ca\") pod \"route-controller-manager-8566c8f488-9992r\" (UID: \"adf401dc-8b1a-450f-ab64-0b8d881116d4\") " pod="openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.384681 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adf401dc-8b1a-450f-ab64-0b8d881116d4-serving-cert\") pod \"route-controller-manager-8566c8f488-9992r\" (UID: \"adf401dc-8b1a-450f-ab64-0b8d881116d4\") " pod="openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.396540 4687 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8rpf\" (UniqueName: \"kubernetes.io/projected/adf401dc-8b1a-450f-ab64-0b8d881116d4-kube-api-access-k8rpf\") pod \"route-controller-manager-8566c8f488-9992r\" (UID: \"adf401dc-8b1a-450f-ab64-0b8d881116d4\") " pod="openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.511667 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.636401 4687 patch_prober.go:28] interesting pod/route-controller-manager-799dfd4db7-szmzz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: i/o timeout" start-of-body= Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.636709 4687 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz" podUID="b1f6711c-8518-46d6-a6ab-fae4a3e26f6d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: i/o timeout" Feb 28 09:05:37 crc kubenswrapper[4687]: E0228 09:05:37.754035 4687 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e6b6d91b683c4824d9fc5ef34d2ab0bd79f749327d9140168cea7b673cb637a6" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 28 09:05:37 crc kubenswrapper[4687]: E0228 09:05:37.756308 4687 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , 
exit code -1" containerID="e6b6d91b683c4824d9fc5ef34d2ab0bd79f749327d9140168cea7b673cb637a6" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 28 09:05:37 crc kubenswrapper[4687]: E0228 09:05:37.758755 4687 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e6b6d91b683c4824d9fc5ef34d2ab0bd79f749327d9140168cea7b673cb637a6" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 28 09:05:37 crc kubenswrapper[4687]: E0228 09:05:37.758825 4687 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd" podUID="3b45242a-b238-4814-b6fa-f22a62c5907f" containerName="kube-multus-additional-cni-plugins" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.894576 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r"] Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.916131 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r" event={"ID":"adf401dc-8b1a-450f-ab64-0b8d881116d4","Type":"ContainerStarted","Data":"32ffcccdb34c4d13b13447bbf247796d7160f98aff8f328f58d33c484396e161"} Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.920664 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" event={"ID":"ea8ade20-0f06-4868-8df4-70d1c2fb40ce","Type":"ContainerDied","Data":"d3076cd4f3199841ac30c56c8ddfc1c629e748c0e914304904a8a5839dcdea56"} Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.920705 4687 scope.go:117] "RemoveContainer" 
containerID="6f6efd338801af6c83f7194389cb16beca1b98555599f3131211c4316e03c372" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.920798 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.930500 4687 generic.go:334] "Generic (PLEG): container finished" podID="512bb25a-8693-4a78-afcc-77e005a73c0f" containerID="552b78f4321a4aa244e26d4a8c59ac4e2ac5fe20d20cb55abc6478cb2bfe39f7" exitCode=0 Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.930561 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8gpq" event={"ID":"512bb25a-8693-4a78-afcc-77e005a73c0f","Type":"ContainerDied","Data":"552b78f4321a4aa244e26d4a8c59ac4e2ac5fe20d20cb55abc6478cb2bfe39f7"} Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.935376 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.935800 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz" event={"ID":"b1f6711c-8518-46d6-a6ab-fae4a3e26f6d","Type":"ContainerDied","Data":"5829b2073e4fd8c473df9a8b2efbff496f9e4ac0dba4b81bf06f58fcd112686b"} Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.937600 4687 generic.go:334] "Generic (PLEG): container finished" podID="556a0190-2912-4b71-a5ae-70c614769f9d" containerID="2edf1d7888590d49ff2841dc80d960df8aa15b63e020d11f69adab67340380ea" exitCode=0 Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.937693 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nkgl2" 
event={"ID":"556a0190-2912-4b71-a5ae-70c614769f9d","Type":"ContainerDied","Data":"2edf1d7888590d49ff2841dc80d960df8aa15b63e020d11f69adab67340380ea"} Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.948050 4687 generic.go:334] "Generic (PLEG): container finished" podID="9a9c467e-d2ff-4322-bc25-5cfe38dff784" containerID="d9e7885e0300d4a9b1f06cbbd1dadd1876b37852eda1388b1b7afe90799ec894" exitCode=0 Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.948193 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svtsw" event={"ID":"9a9c467e-d2ff-4322-bc25-5cfe38dff784","Type":"ContainerDied","Data":"d9e7885e0300d4a9b1f06cbbd1dadd1876b37852eda1388b1b7afe90799ec894"} Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.952400 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qvt4" event={"ID":"998b35fc-9704-4608-94c8-eccb4ca28857","Type":"ContainerStarted","Data":"0945a46efecb298e6d2daf2027d300d4d886e1f024eced027f6c311f673016ab"} Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.959143 4687 generic.go:334] "Generic (PLEG): container finished" podID="19def7b9-fb5d-4e49-98db-784814aa9769" containerID="3553f8a98332f2bab154072ddb689b4c39eeb8ea73c97d2d8a5eb9b85ac11d9f" exitCode=0 Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.959203 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7sr6" event={"ID":"19def7b9-fb5d-4e49-98db-784814aa9769","Type":"ContainerDied","Data":"3553f8a98332f2bab154072ddb689b4c39eeb8ea73c97d2d8a5eb9b85ac11d9f"} Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.963310 4687 scope.go:117] "RemoveContainer" containerID="34a89462f946cb6fa3e17cad723603922f3ba59834981d08c61c27b241b40c57" Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.969090 4687 generic.go:334] "Generic (PLEG): container finished" podID="f193be8c-c2cf-4d79-ac3d-fed262658077" 
containerID="d9de5fc46fef3ee0a0a30cb03de3cf268ecafd55cc468a6ec289f3d271100ff7" exitCode=0 Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.969156 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7l47s" event={"ID":"f193be8c-c2cf-4d79-ac3d-fed262658077","Type":"ContainerDied","Data":"d9de5fc46fef3ee0a0a30cb03de3cf268ecafd55cc468a6ec289f3d271100ff7"} Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.975921 4687 generic.go:334] "Generic (PLEG): container finished" podID="68c495db-6852-4932-996a-053d7c113f22" containerID="1e314e94326dd925641c2ec84adb71411e9ba3d829a3df3b7f7a97acc9310853" exitCode=0 Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.976251 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k67sw" event={"ID":"68c495db-6852-4932-996a-053d7c113f22","Type":"ContainerDied","Data":"1e314e94326dd925641c2ec84adb71411e9ba3d829a3df3b7f7a97acc9310853"} Feb 28 09:05:37 crc kubenswrapper[4687]: I0228 09:05:37.979973 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npwxl" event={"ID":"69eb70ff-d8c7-4dba-9f8e-1969b7947640","Type":"ContainerStarted","Data":"8f9269440387fcd1384a163214cb39f6725a938b641150d79b6f766b30ea3ef1"} Feb 28 09:05:38 crc kubenswrapper[4687]: I0228 09:05:38.099883 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl"] Feb 28 09:05:38 crc kubenswrapper[4687]: I0228 09:05:38.110338 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6f9f58c4dd-tlnbl"] Feb 28 09:05:38 crc kubenswrapper[4687]: I0228 09:05:38.121820 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz"] Feb 28 09:05:38 crc kubenswrapper[4687]: I0228 09:05:38.127875 4687 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-799dfd4db7-szmzz"] Feb 28 09:05:38 crc kubenswrapper[4687]: I0228 09:05:38.666051 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1f6711c-8518-46d6-a6ab-fae4a3e26f6d" path="/var/lib/kubelet/pods/b1f6711c-8518-46d6-a6ab-fae4a3e26f6d/volumes" Feb 28 09:05:38 crc kubenswrapper[4687]: I0228 09:05:38.666572 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea8ade20-0f06-4868-8df4-70d1c2fb40ce" path="/var/lib/kubelet/pods/ea8ade20-0f06-4868-8df4-70d1c2fb40ce/volumes" Feb 28 09:05:38 crc kubenswrapper[4687]: I0228 09:05:38.989489 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nkgl2" event={"ID":"556a0190-2912-4b71-a5ae-70c614769f9d","Type":"ContainerStarted","Data":"5d9d0877b1876fa4e990e89ef5e732cdcc8613082ec8187a2a43ebb241cb6118"} Feb 28 09:05:38 crc kubenswrapper[4687]: I0228 09:05:38.991404 4687 generic.go:334] "Generic (PLEG): container finished" podID="998b35fc-9704-4608-94c8-eccb4ca28857" containerID="0945a46efecb298e6d2daf2027d300d4d886e1f024eced027f6c311f673016ab" exitCode=0 Feb 28 09:05:38 crc kubenswrapper[4687]: I0228 09:05:38.991443 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qvt4" event={"ID":"998b35fc-9704-4608-94c8-eccb4ca28857","Type":"ContainerDied","Data":"0945a46efecb298e6d2daf2027d300d4d886e1f024eced027f6c311f673016ab"} Feb 28 09:05:38 crc kubenswrapper[4687]: I0228 09:05:38.994125 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7l47s" event={"ID":"f193be8c-c2cf-4d79-ac3d-fed262658077","Type":"ContainerStarted","Data":"2bd81135d7ddffa3488aeba049128355c4f6c7f94a9b0da8256807202a3d81ba"} Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.000328 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k67sw" 
event={"ID":"68c495db-6852-4932-996a-053d7c113f22","Type":"ContainerStarted","Data":"d663e83d6cf522fb765b47d14de0410716b92051cfd8fb2b7313dfa9c83fe57a"} Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.008424 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8gpq" event={"ID":"512bb25a-8693-4a78-afcc-77e005a73c0f","Type":"ContainerStarted","Data":"be2f7b43352db1f38cd9996259f6f3a64708d9ce93706c1ee00ca51759565c3e"} Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.011431 4687 generic.go:334] "Generic (PLEG): container finished" podID="69eb70ff-d8c7-4dba-9f8e-1969b7947640" containerID="8f9269440387fcd1384a163214cb39f6725a938b641150d79b6f766b30ea3ef1" exitCode=0 Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.011488 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npwxl" event={"ID":"69eb70ff-d8c7-4dba-9f8e-1969b7947640","Type":"ContainerDied","Data":"8f9269440387fcd1384a163214cb39f6725a938b641150d79b6f766b30ea3ef1"} Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.011510 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npwxl" event={"ID":"69eb70ff-d8c7-4dba-9f8e-1969b7947640","Type":"ContainerStarted","Data":"65ebb5c1faa7cb593fe34005a85e0adb7db29e458301111b369fead106e6b736"} Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.017986 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svtsw" event={"ID":"9a9c467e-d2ff-4322-bc25-5cfe38dff784","Type":"ContainerStarted","Data":"c38452a9600a892d59edb8b34ac75c4650369b7663d70bc75b454c7aeb9ca89a"} Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.019091 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r" 
event={"ID":"adf401dc-8b1a-450f-ab64-0b8d881116d4","Type":"ContainerStarted","Data":"e253c816abb307ec7c342605ad41dcefcad81d7a30eb6047ccf93190bbaf45cb"} Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.019175 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.022176 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7sr6" event={"ID":"19def7b9-fb5d-4e49-98db-784814aa9769","Type":"ContainerStarted","Data":"7f2500302ff2b6128c42f68c74c29a77ecd72caf1a43c5e923d347e9e56a59cd"} Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.026745 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nkgl2" podStartSLOduration=3.682350154 podStartE2EDuration="29.026735332s" podCreationTimestamp="2026-02-28 09:05:10 +0000 UTC" firstStartedPulling="2026-02-28 09:05:13.098196833 +0000 UTC m=+104.788766171" lastFinishedPulling="2026-02-28 09:05:38.442582012 +0000 UTC m=+130.133151349" observedRunningTime="2026-02-28 09:05:39.008165163 +0000 UTC m=+130.698734499" watchObservedRunningTime="2026-02-28 09:05:39.026735332 +0000 UTC m=+130.717304669" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.028287 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.047466 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7l47s" podStartSLOduration=2.65027724 podStartE2EDuration="28.047456388s" podCreationTimestamp="2026-02-28 09:05:11 +0000 UTC" firstStartedPulling="2026-02-28 09:05:13.076999632 +0000 UTC m=+104.767568969" lastFinishedPulling="2026-02-28 09:05:38.47417878 +0000 UTC 
m=+130.164748117" observedRunningTime="2026-02-28 09:05:39.028051247 +0000 UTC m=+130.718620584" watchObservedRunningTime="2026-02-28 09:05:39.047456388 +0000 UTC m=+130.738025725" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.069446 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k67sw" podStartSLOduration=2.66375085 podStartE2EDuration="28.069436543s" podCreationTimestamp="2026-02-28 09:05:11 +0000 UTC" firstStartedPulling="2026-02-28 09:05:13.091610662 +0000 UTC m=+104.782179999" lastFinishedPulling="2026-02-28 09:05:38.497296354 +0000 UTC m=+130.187865692" observedRunningTime="2026-02-28 09:05:39.067280105 +0000 UTC m=+130.757849442" watchObservedRunningTime="2026-02-28 09:05:39.069436543 +0000 UTC m=+130.760005880" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.115486 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f7sr6" podStartSLOduration=3.729431557 podStartE2EDuration="29.115465254s" podCreationTimestamp="2026-02-28 09:05:10 +0000 UTC" firstStartedPulling="2026-02-28 09:05:13.171230944 +0000 UTC m=+104.861800282" lastFinishedPulling="2026-02-28 09:05:38.557264642 +0000 UTC m=+130.247833979" observedRunningTime="2026-02-28 09:05:39.113909086 +0000 UTC m=+130.804478423" watchObservedRunningTime="2026-02-28 09:05:39.115465254 +0000 UTC m=+130.806034591" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.117057 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r" podStartSLOduration=9.117048192 podStartE2EDuration="9.117048192s" podCreationTimestamp="2026-02-28 09:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:39.086074877 +0000 UTC m=+130.776644214" 
watchObservedRunningTime="2026-02-28 09:05:39.117048192 +0000 UTC m=+130.807617529" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.137527 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f8gpq" podStartSLOduration=3.31343478 podStartE2EDuration="26.137513046s" podCreationTimestamp="2026-02-28 09:05:13 +0000 UTC" firstStartedPulling="2026-02-28 09:05:15.592437763 +0000 UTC m=+107.283007101" lastFinishedPulling="2026-02-28 09:05:38.41651603 +0000 UTC m=+130.107085367" observedRunningTime="2026-02-28 09:05:39.136876247 +0000 UTC m=+130.827445584" watchObservedRunningTime="2026-02-28 09:05:39.137513046 +0000 UTC m=+130.828082382" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.153199 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-npwxl" podStartSLOduration=3.17636574 podStartE2EDuration="26.153186945s" podCreationTimestamp="2026-02-28 09:05:13 +0000 UTC" firstStartedPulling="2026-02-28 09:05:15.638293349 +0000 UTC m=+107.328862686" lastFinishedPulling="2026-02-28 09:05:38.615114554 +0000 UTC m=+130.305683891" observedRunningTime="2026-02-28 09:05:39.15057447 +0000 UTC m=+130.841143807" watchObservedRunningTime="2026-02-28 09:05:39.153186945 +0000 UTC m=+130.843756283" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.333702 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-svtsw" podStartSLOduration=3.336424566 podStartE2EDuration="27.333678763s" podCreationTimestamp="2026-02-28 09:05:12 +0000 UTC" firstStartedPulling="2026-02-28 09:05:14.50456153 +0000 UTC m=+106.195130867" lastFinishedPulling="2026-02-28 09:05:38.501815738 +0000 UTC m=+130.192385064" observedRunningTime="2026-02-28 09:05:39.170237945 +0000 UTC m=+130.860807283" watchObservedRunningTime="2026-02-28 09:05:39.333678763 +0000 UTC m=+131.024248139" Feb 28 09:05:39 crc 
kubenswrapper[4687]: I0228 09:05:39.334136 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8448bc778b-8vjlh"] Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.335405 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.338605 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.338893 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.338907 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.339959 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.340340 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.351662 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.356031 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.357380 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8448bc778b-8vjlh"] Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.506301 4687 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8f39932a-51e0-4fa3-ad90-84d3f82d129f-proxy-ca-bundles\") pod \"controller-manager-8448bc778b-8vjlh\" (UID: \"8f39932a-51e0-4fa3-ad90-84d3f82d129f\") " pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.506574 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f39932a-51e0-4fa3-ad90-84d3f82d129f-serving-cert\") pod \"controller-manager-8448bc778b-8vjlh\" (UID: \"8f39932a-51e0-4fa3-ad90-84d3f82d129f\") " pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.506613 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f39932a-51e0-4fa3-ad90-84d3f82d129f-config\") pod \"controller-manager-8448bc778b-8vjlh\" (UID: \"8f39932a-51e0-4fa3-ad90-84d3f82d129f\") " pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.506631 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f39932a-51e0-4fa3-ad90-84d3f82d129f-client-ca\") pod \"controller-manager-8448bc778b-8vjlh\" (UID: \"8f39932a-51e0-4fa3-ad90-84d3f82d129f\") " pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.506680 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8phzd\" (UniqueName: \"kubernetes.io/projected/8f39932a-51e0-4fa3-ad90-84d3f82d129f-kube-api-access-8phzd\") pod \"controller-manager-8448bc778b-8vjlh\" (UID: 
\"8f39932a-51e0-4fa3-ad90-84d3f82d129f\") " pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.607772 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f39932a-51e0-4fa3-ad90-84d3f82d129f-config\") pod \"controller-manager-8448bc778b-8vjlh\" (UID: \"8f39932a-51e0-4fa3-ad90-84d3f82d129f\") " pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.607845 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f39932a-51e0-4fa3-ad90-84d3f82d129f-client-ca\") pod \"controller-manager-8448bc778b-8vjlh\" (UID: \"8f39932a-51e0-4fa3-ad90-84d3f82d129f\") " pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.607894 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8phzd\" (UniqueName: \"kubernetes.io/projected/8f39932a-51e0-4fa3-ad90-84d3f82d129f-kube-api-access-8phzd\") pod \"controller-manager-8448bc778b-8vjlh\" (UID: \"8f39932a-51e0-4fa3-ad90-84d3f82d129f\") " pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.607935 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8f39932a-51e0-4fa3-ad90-84d3f82d129f-proxy-ca-bundles\") pod \"controller-manager-8448bc778b-8vjlh\" (UID: \"8f39932a-51e0-4fa3-ad90-84d3f82d129f\") " pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.607973 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8f39932a-51e0-4fa3-ad90-84d3f82d129f-serving-cert\") pod \"controller-manager-8448bc778b-8vjlh\" (UID: \"8f39932a-51e0-4fa3-ad90-84d3f82d129f\") " pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.609515 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f39932a-51e0-4fa3-ad90-84d3f82d129f-client-ca\") pod \"controller-manager-8448bc778b-8vjlh\" (UID: \"8f39932a-51e0-4fa3-ad90-84d3f82d129f\") " pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.609616 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8f39932a-51e0-4fa3-ad90-84d3f82d129f-proxy-ca-bundles\") pod \"controller-manager-8448bc778b-8vjlh\" (UID: \"8f39932a-51e0-4fa3-ad90-84d3f82d129f\") " pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.609971 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f39932a-51e0-4fa3-ad90-84d3f82d129f-config\") pod \"controller-manager-8448bc778b-8vjlh\" (UID: \"8f39932a-51e0-4fa3-ad90-84d3f82d129f\") " pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.614287 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f39932a-51e0-4fa3-ad90-84d3f82d129f-serving-cert\") pod \"controller-manager-8448bc778b-8vjlh\" (UID: \"8f39932a-51e0-4fa3-ad90-84d3f82d129f\") " pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.627827 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-8phzd\" (UniqueName: \"kubernetes.io/projected/8f39932a-51e0-4fa3-ad90-84d3f82d129f-kube-api-access-8phzd\") pod \"controller-manager-8448bc778b-8vjlh\" (UID: \"8f39932a-51e0-4fa3-ad90-84d3f82d129f\") " pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.658941 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" Feb 28 09:05:39 crc kubenswrapper[4687]: I0228 09:05:39.882300 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8448bc778b-8vjlh"] Feb 28 09:05:39 crc kubenswrapper[4687]: W0228 09:05:39.884542 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f39932a_51e0_4fa3_ad90_84d3f82d129f.slice/crio-19aaf32554409f11e9f521a3fc9b0340eadafc5c4b8df46297069eb1aa041e0f WatchSource:0}: Error finding container 19aaf32554409f11e9f521a3fc9b0340eadafc5c4b8df46297069eb1aa041e0f: Status 404 returned error can't find the container with id 19aaf32554409f11e9f521a3fc9b0340eadafc5c4b8df46297069eb1aa041e0f Feb 28 09:05:40 crc kubenswrapper[4687]: I0228 09:05:40.032630 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qvt4" event={"ID":"998b35fc-9704-4608-94c8-eccb4ca28857","Type":"ContainerStarted","Data":"3c8c0bbd4f3cfedae4216f03f8f79fdb701c2093a136c8b430ee730b5f921fe7"} Feb 28 09:05:40 crc kubenswrapper[4687]: I0228 09:05:40.034269 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" event={"ID":"8f39932a-51e0-4fa3-ad90-84d3f82d129f","Type":"ContainerStarted","Data":"88632df7131420b704135b33a8c4032604375fc6cc3dd53382fa94ee1f91631b"} Feb 28 09:05:40 crc kubenswrapper[4687]: I0228 09:05:40.034316 4687 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" event={"ID":"8f39932a-51e0-4fa3-ad90-84d3f82d129f","Type":"ContainerStarted","Data":"19aaf32554409f11e9f521a3fc9b0340eadafc5c4b8df46297069eb1aa041e0f"} Feb 28 09:05:40 crc kubenswrapper[4687]: I0228 09:05:40.058508 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6qvt4" podStartSLOduration=2.203542562 podStartE2EDuration="26.058492109s" podCreationTimestamp="2026-02-28 09:05:14 +0000 UTC" firstStartedPulling="2026-02-28 09:05:15.61238835 +0000 UTC m=+107.302957687" lastFinishedPulling="2026-02-28 09:05:39.467337896 +0000 UTC m=+131.157907234" observedRunningTime="2026-02-28 09:05:40.05596274 +0000 UTC m=+131.746532077" watchObservedRunningTime="2026-02-28 09:05:40.058492109 +0000 UTC m=+131.749061446" Feb 28 09:05:41 crc kubenswrapper[4687]: I0228 09:05:41.039501 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" Feb 28 09:05:41 crc kubenswrapper[4687]: I0228 09:05:41.045912 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" Feb 28 09:05:41 crc kubenswrapper[4687]: I0228 09:05:41.060073 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" podStartSLOduration=11.060051781 podStartE2EDuration="11.060051781s" podCreationTimestamp="2026-02-28 09:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:40.089630735 +0000 UTC m=+131.780200071" watchObservedRunningTime="2026-02-28 09:05:41.060051781 +0000 UTC m=+132.750621118" Feb 28 09:05:41 crc kubenswrapper[4687]: I0228 09:05:41.273612 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-nkgl2" Feb 28 09:05:41 crc kubenswrapper[4687]: I0228 09:05:41.273668 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nkgl2" Feb 28 09:05:41 crc kubenswrapper[4687]: I0228 09:05:41.296262 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f7sr6" Feb 28 09:05:41 crc kubenswrapper[4687]: I0228 09:05:41.296299 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f7sr6" Feb 28 09:05:41 crc kubenswrapper[4687]: I0228 09:05:41.346546 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f7sr6" Feb 28 09:05:41 crc kubenswrapper[4687]: I0228 09:05:41.348694 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nkgl2" Feb 28 09:05:41 crc kubenswrapper[4687]: I0228 09:05:41.684773 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7l47s" Feb 28 09:05:41 crc kubenswrapper[4687]: I0228 09:05:41.684861 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7l47s" Feb 28 09:05:41 crc kubenswrapper[4687]: I0228 09:05:41.720345 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k67sw" Feb 28 09:05:41 crc kubenswrapper[4687]: I0228 09:05:41.720431 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k67sw" Feb 28 09:05:41 crc kubenswrapper[4687]: I0228 09:05:41.722122 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7l47s" Feb 28 09:05:41 crc 
kubenswrapper[4687]: I0228 09:05:41.760992 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k67sw" Feb 28 09:05:43 crc kubenswrapper[4687]: I0228 09:05:43.151314 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-svtsw" Feb 28 09:05:43 crc kubenswrapper[4687]: I0228 09:05:43.151531 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-svtsw" Feb 28 09:05:43 crc kubenswrapper[4687]: I0228 09:05:43.179562 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-svtsw" Feb 28 09:05:43 crc kubenswrapper[4687]: I0228 09:05:43.503658 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-jb8xd_3b45242a-b238-4814-b6fa-f22a62c5907f/kube-multus-additional-cni-plugins/0.log" Feb 28 09:05:43 crc kubenswrapper[4687]: I0228 09:05:43.503730 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd" Feb 28 09:05:43 crc kubenswrapper[4687]: I0228 09:05:43.533466 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f8gpq" Feb 28 09:05:43 crc kubenswrapper[4687]: I0228 09:05:43.533596 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f8gpq" Feb 28 09:05:43 crc kubenswrapper[4687]: I0228 09:05:43.564890 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f8gpq" Feb 28 09:05:43 crc kubenswrapper[4687]: I0228 09:05:43.660350 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3b45242a-b238-4814-b6fa-f22a62c5907f-cni-sysctl-allowlist\") pod \"3b45242a-b238-4814-b6fa-f22a62c5907f\" (UID: \"3b45242a-b238-4814-b6fa-f22a62c5907f\") " Feb 28 09:05:43 crc kubenswrapper[4687]: I0228 09:05:43.660441 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3b45242a-b238-4814-b6fa-f22a62c5907f-tuning-conf-dir\") pod \"3b45242a-b238-4814-b6fa-f22a62c5907f\" (UID: \"3b45242a-b238-4814-b6fa-f22a62c5907f\") " Feb 28 09:05:43 crc kubenswrapper[4687]: I0228 09:05:43.660486 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3b45242a-b238-4814-b6fa-f22a62c5907f-ready\") pod \"3b45242a-b238-4814-b6fa-f22a62c5907f\" (UID: \"3b45242a-b238-4814-b6fa-f22a62c5907f\") " Feb 28 09:05:43 crc kubenswrapper[4687]: I0228 09:05:43.660515 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sp8t\" (UniqueName: \"kubernetes.io/projected/3b45242a-b238-4814-b6fa-f22a62c5907f-kube-api-access-6sp8t\") pod 
\"3b45242a-b238-4814-b6fa-f22a62c5907f\" (UID: \"3b45242a-b238-4814-b6fa-f22a62c5907f\") " Feb 28 09:05:43 crc kubenswrapper[4687]: I0228 09:05:43.660609 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b45242a-b238-4814-b6fa-f22a62c5907f-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "3b45242a-b238-4814-b6fa-f22a62c5907f" (UID: "3b45242a-b238-4814-b6fa-f22a62c5907f"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:05:43 crc kubenswrapper[4687]: I0228 09:05:43.660733 4687 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3b45242a-b238-4814-b6fa-f22a62c5907f-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:43 crc kubenswrapper[4687]: I0228 09:05:43.661221 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b45242a-b238-4814-b6fa-f22a62c5907f-ready" (OuterVolumeSpecName: "ready") pod "3b45242a-b238-4814-b6fa-f22a62c5907f" (UID: "3b45242a-b238-4814-b6fa-f22a62c5907f"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:05:43 crc kubenswrapper[4687]: I0228 09:05:43.661261 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b45242a-b238-4814-b6fa-f22a62c5907f-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "3b45242a-b238-4814-b6fa-f22a62c5907f" (UID: "3b45242a-b238-4814-b6fa-f22a62c5907f"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:05:43 crc kubenswrapper[4687]: I0228 09:05:43.668677 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b45242a-b238-4814-b6fa-f22a62c5907f-kube-api-access-6sp8t" (OuterVolumeSpecName: "kube-api-access-6sp8t") pod "3b45242a-b238-4814-b6fa-f22a62c5907f" (UID: "3b45242a-b238-4814-b6fa-f22a62c5907f"). InnerVolumeSpecName "kube-api-access-6sp8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:05:43 crc kubenswrapper[4687]: I0228 09:05:43.761632 4687 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3b45242a-b238-4814-b6fa-f22a62c5907f-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:43 crc kubenswrapper[4687]: I0228 09:05:43.761664 4687 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3b45242a-b238-4814-b6fa-f22a62c5907f-ready\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:43 crc kubenswrapper[4687]: I0228 09:05:43.761675 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sp8t\" (UniqueName: \"kubernetes.io/projected/3b45242a-b238-4814-b6fa-f22a62c5907f-kube-api-access-6sp8t\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:44 crc kubenswrapper[4687]: I0228 09:05:44.054926 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-jb8xd_3b45242a-b238-4814-b6fa-f22a62c5907f/kube-multus-additional-cni-plugins/0.log" Feb 28 09:05:44 crc kubenswrapper[4687]: I0228 09:05:44.054971 4687 generic.go:334] "Generic (PLEG): container finished" podID="3b45242a-b238-4814-b6fa-f22a62c5907f" containerID="e6b6d91b683c4824d9fc5ef34d2ab0bd79f749327d9140168cea7b673cb637a6" exitCode=137 Feb 28 09:05:44 crc kubenswrapper[4687]: I0228 09:05:44.055226 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd" event={"ID":"3b45242a-b238-4814-b6fa-f22a62c5907f","Type":"ContainerDied","Data":"e6b6d91b683c4824d9fc5ef34d2ab0bd79f749327d9140168cea7b673cb637a6"} Feb 28 09:05:44 crc kubenswrapper[4687]: I0228 09:05:44.055274 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd" event={"ID":"3b45242a-b238-4814-b6fa-f22a62c5907f","Type":"ContainerDied","Data":"6525799c5f3fba6dced15c54186b4f6d7835988f3f759f95dec8e25f4c6cb802"} Feb 28 09:05:44 crc kubenswrapper[4687]: I0228 09:05:44.055296 4687 scope.go:117] "RemoveContainer" containerID="e6b6d91b683c4824d9fc5ef34d2ab0bd79f749327d9140168cea7b673cb637a6" Feb 28 09:05:44 crc kubenswrapper[4687]: I0228 09:05:44.055418 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-jb8xd" Feb 28 09:05:44 crc kubenswrapper[4687]: I0228 09:05:44.087797 4687 scope.go:117] "RemoveContainer" containerID="e6b6d91b683c4824d9fc5ef34d2ab0bd79f749327d9140168cea7b673cb637a6" Feb 28 09:05:44 crc kubenswrapper[4687]: E0228 09:05:44.094377 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6b6d91b683c4824d9fc5ef34d2ab0bd79f749327d9140168cea7b673cb637a6\": container with ID starting with e6b6d91b683c4824d9fc5ef34d2ab0bd79f749327d9140168cea7b673cb637a6 not found: ID does not exist" containerID="e6b6d91b683c4824d9fc5ef34d2ab0bd79f749327d9140168cea7b673cb637a6" Feb 28 09:05:44 crc kubenswrapper[4687]: I0228 09:05:44.094444 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6b6d91b683c4824d9fc5ef34d2ab0bd79f749327d9140168cea7b673cb637a6"} err="failed to get container status \"e6b6d91b683c4824d9fc5ef34d2ab0bd79f749327d9140168cea7b673cb637a6\": rpc error: code = NotFound desc = could not find container 
\"e6b6d91b683c4824d9fc5ef34d2ab0bd79f749327d9140168cea7b673cb637a6\": container with ID starting with e6b6d91b683c4824d9fc5ef34d2ab0bd79f749327d9140168cea7b673cb637a6 not found: ID does not exist" Feb 28 09:05:44 crc kubenswrapper[4687]: I0228 09:05:44.108538 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f8gpq" Feb 28 09:05:44 crc kubenswrapper[4687]: I0228 09:05:44.110292 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-jb8xd"] Feb 28 09:05:44 crc kubenswrapper[4687]: I0228 09:05:44.117005 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-jb8xd"] Feb 28 09:05:44 crc kubenswrapper[4687]: I0228 09:05:44.119370 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-svtsw" Feb 28 09:05:44 crc kubenswrapper[4687]: I0228 09:05:44.301401 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-npwxl" Feb 28 09:05:44 crc kubenswrapper[4687]: I0228 09:05:44.302068 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-npwxl" Feb 28 09:05:44 crc kubenswrapper[4687]: I0228 09:05:44.663173 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b45242a-b238-4814-b6fa-f22a62c5907f" path="/var/lib/kubelet/pods/3b45242a-b238-4814-b6fa-f22a62c5907f/volumes" Feb 28 09:05:44 crc kubenswrapper[4687]: I0228 09:05:44.703442 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6qvt4" Feb 28 09:05:44 crc kubenswrapper[4687]: I0228 09:05:44.703639 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6qvt4" Feb 28 09:05:45 crc kubenswrapper[4687]: I0228 09:05:45.332709 4687 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-npwxl" podUID="69eb70ff-d8c7-4dba-9f8e-1969b7947640" containerName="registry-server" probeResult="failure" output=< Feb 28 09:05:45 crc kubenswrapper[4687]: timeout: failed to connect service ":50051" within 1s Feb 28 09:05:45 crc kubenswrapper[4687]: > Feb 28 09:05:45 crc kubenswrapper[4687]: I0228 09:05:45.734622 4687 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6qvt4" podUID="998b35fc-9704-4608-94c8-eccb4ca28857" containerName="registry-server" probeResult="failure" output=< Feb 28 09:05:45 crc kubenswrapper[4687]: timeout: failed to connect service ":50051" within 1s Feb 28 09:05:45 crc kubenswrapper[4687]: > Feb 28 09:05:47 crc kubenswrapper[4687]: I0228 09:05:47.767032 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8gpq"] Feb 28 09:05:47 crc kubenswrapper[4687]: I0228 09:05:47.767612 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f8gpq" podUID="512bb25a-8693-4a78-afcc-77e005a73c0f" containerName="registry-server" containerID="cri-o://be2f7b43352db1f38cd9996259f6f3a64708d9ce93706c1ee00ca51759565c3e" gracePeriod=2 Feb 28 09:05:48 crc kubenswrapper[4687]: I0228 09:05:48.078134 4687 generic.go:334] "Generic (PLEG): container finished" podID="512bb25a-8693-4a78-afcc-77e005a73c0f" containerID="be2f7b43352db1f38cd9996259f6f3a64708d9ce93706c1ee00ca51759565c3e" exitCode=0 Feb 28 09:05:48 crc kubenswrapper[4687]: I0228 09:05:48.078200 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8gpq" event={"ID":"512bb25a-8693-4a78-afcc-77e005a73c0f","Type":"ContainerDied","Data":"be2f7b43352db1f38cd9996259f6f3a64708d9ce93706c1ee00ca51759565c3e"} Feb 28 09:05:48 crc kubenswrapper[4687]: I0228 09:05:48.148768 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8gpq" Feb 28 09:05:48 crc kubenswrapper[4687]: I0228 09:05:48.321230 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/512bb25a-8693-4a78-afcc-77e005a73c0f-utilities\") pod \"512bb25a-8693-4a78-afcc-77e005a73c0f\" (UID: \"512bb25a-8693-4a78-afcc-77e005a73c0f\") " Feb 28 09:05:48 crc kubenswrapper[4687]: I0228 09:05:48.321300 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v79xk\" (UniqueName: \"kubernetes.io/projected/512bb25a-8693-4a78-afcc-77e005a73c0f-kube-api-access-v79xk\") pod \"512bb25a-8693-4a78-afcc-77e005a73c0f\" (UID: \"512bb25a-8693-4a78-afcc-77e005a73c0f\") " Feb 28 09:05:48 crc kubenswrapper[4687]: I0228 09:05:48.321349 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/512bb25a-8693-4a78-afcc-77e005a73c0f-catalog-content\") pod \"512bb25a-8693-4a78-afcc-77e005a73c0f\" (UID: \"512bb25a-8693-4a78-afcc-77e005a73c0f\") " Feb 28 09:05:48 crc kubenswrapper[4687]: I0228 09:05:48.321974 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/512bb25a-8693-4a78-afcc-77e005a73c0f-utilities" (OuterVolumeSpecName: "utilities") pod "512bb25a-8693-4a78-afcc-77e005a73c0f" (UID: "512bb25a-8693-4a78-afcc-77e005a73c0f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:05:48 crc kubenswrapper[4687]: I0228 09:05:48.327310 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/512bb25a-8693-4a78-afcc-77e005a73c0f-kube-api-access-v79xk" (OuterVolumeSpecName: "kube-api-access-v79xk") pod "512bb25a-8693-4a78-afcc-77e005a73c0f" (UID: "512bb25a-8693-4a78-afcc-77e005a73c0f"). InnerVolumeSpecName "kube-api-access-v79xk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:05:48 crc kubenswrapper[4687]: I0228 09:05:48.341842 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/512bb25a-8693-4a78-afcc-77e005a73c0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "512bb25a-8693-4a78-afcc-77e005a73c0f" (UID: "512bb25a-8693-4a78-afcc-77e005a73c0f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:05:48 crc kubenswrapper[4687]: I0228 09:05:48.423129 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/512bb25a-8693-4a78-afcc-77e005a73c0f-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:48 crc kubenswrapper[4687]: I0228 09:05:48.423157 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v79xk\" (UniqueName: \"kubernetes.io/projected/512bb25a-8693-4a78-afcc-77e005a73c0f-kube-api-access-v79xk\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:48 crc kubenswrapper[4687]: I0228 09:05:48.423168 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/512bb25a-8693-4a78-afcc-77e005a73c0f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:48 crc kubenswrapper[4687]: I0228 09:05:48.831286 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mhzwl" Feb 28 09:05:49 crc kubenswrapper[4687]: I0228 09:05:49.085403 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8gpq" event={"ID":"512bb25a-8693-4a78-afcc-77e005a73c0f","Type":"ContainerDied","Data":"f0a131154c270182a4e882ee3b2affd3cdcfd109e6cbe48b209129dc2d3c0def"} Feb 28 09:05:49 crc kubenswrapper[4687]: I0228 09:05:49.085481 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8gpq" Feb 28 09:05:49 crc kubenswrapper[4687]: I0228 09:05:49.085480 4687 scope.go:117] "RemoveContainer" containerID="be2f7b43352db1f38cd9996259f6f3a64708d9ce93706c1ee00ca51759565c3e" Feb 28 09:05:49 crc kubenswrapper[4687]: I0228 09:05:49.102103 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8gpq"] Feb 28 09:05:49 crc kubenswrapper[4687]: I0228 09:05:49.105054 4687 scope.go:117] "RemoveContainer" containerID="552b78f4321a4aa244e26d4a8c59ac4e2ac5fe20d20cb55abc6478cb2bfe39f7" Feb 28 09:05:49 crc kubenswrapper[4687]: I0228 09:05:49.106918 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8gpq"] Feb 28 09:05:49 crc kubenswrapper[4687]: I0228 09:05:49.116241 4687 scope.go:117] "RemoveContainer" containerID="381ea14747fff6993b958b645c052d441b57f784bb90a7b1a5439f60c3762659" Feb 28 09:05:50 crc kubenswrapper[4687]: I0228 09:05:50.494607 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8448bc778b-8vjlh"] Feb 28 09:05:50 crc kubenswrapper[4687]: I0228 09:05:50.494803 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" podUID="8f39932a-51e0-4fa3-ad90-84d3f82d129f" containerName="controller-manager" containerID="cri-o://88632df7131420b704135b33a8c4032604375fc6cc3dd53382fa94ee1f91631b" gracePeriod=30 Feb 28 09:05:50 crc kubenswrapper[4687]: I0228 09:05:50.589760 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 09:05:50 crc kubenswrapper[4687]: I0228 09:05:50.605077 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r"] Feb 28 09:05:50 crc kubenswrapper[4687]: I0228 
09:05:50.605397 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r" podUID="adf401dc-8b1a-450f-ab64-0b8d881116d4" containerName="route-controller-manager" containerID="cri-o://e253c816abb307ec7c342605ad41dcefcad81d7a30eb6047ccf93190bbaf45cb" gracePeriod=30 Feb 28 09:05:50 crc kubenswrapper[4687]: I0228 09:05:50.662951 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="512bb25a-8693-4a78-afcc-77e005a73c0f" path="/var/lib/kubelet/pods/512bb25a-8693-4a78-afcc-77e005a73c0f/volumes" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.008179 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.011169 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.059057 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f39932a-51e0-4fa3-ad90-84d3f82d129f-serving-cert\") pod \"8f39932a-51e0-4fa3-ad90-84d3f82d129f\" (UID: \"8f39932a-51e0-4fa3-ad90-84d3f82d129f\") " Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.059155 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8f39932a-51e0-4fa3-ad90-84d3f82d129f-proxy-ca-bundles\") pod \"8f39932a-51e0-4fa3-ad90-84d3f82d129f\" (UID: \"8f39932a-51e0-4fa3-ad90-84d3f82d129f\") " Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.059202 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/adf401dc-8b1a-450f-ab64-0b8d881116d4-config\") pod \"adf401dc-8b1a-450f-ab64-0b8d881116d4\" (UID: \"adf401dc-8b1a-450f-ab64-0b8d881116d4\") " Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.059227 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8phzd\" (UniqueName: \"kubernetes.io/projected/8f39932a-51e0-4fa3-ad90-84d3f82d129f-kube-api-access-8phzd\") pod \"8f39932a-51e0-4fa3-ad90-84d3f82d129f\" (UID: \"8f39932a-51e0-4fa3-ad90-84d3f82d129f\") " Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.059256 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f39932a-51e0-4fa3-ad90-84d3f82d129f-config\") pod \"8f39932a-51e0-4fa3-ad90-84d3f82d129f\" (UID: \"8f39932a-51e0-4fa3-ad90-84d3f82d129f\") " Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.059283 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/adf401dc-8b1a-450f-ab64-0b8d881116d4-client-ca\") pod \"adf401dc-8b1a-450f-ab64-0b8d881116d4\" (UID: \"adf401dc-8b1a-450f-ab64-0b8d881116d4\") " Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.059313 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8rpf\" (UniqueName: \"kubernetes.io/projected/adf401dc-8b1a-450f-ab64-0b8d881116d4-kube-api-access-k8rpf\") pod \"adf401dc-8b1a-450f-ab64-0b8d881116d4\" (UID: \"adf401dc-8b1a-450f-ab64-0b8d881116d4\") " Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.059335 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adf401dc-8b1a-450f-ab64-0b8d881116d4-serving-cert\") pod \"adf401dc-8b1a-450f-ab64-0b8d881116d4\" (UID: \"adf401dc-8b1a-450f-ab64-0b8d881116d4\") " Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.059385 
4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f39932a-51e0-4fa3-ad90-84d3f82d129f-client-ca\") pod \"8f39932a-51e0-4fa3-ad90-84d3f82d129f\" (UID: \"8f39932a-51e0-4fa3-ad90-84d3f82d129f\") " Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.059735 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f39932a-51e0-4fa3-ad90-84d3f82d129f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8f39932a-51e0-4fa3-ad90-84d3f82d129f" (UID: "8f39932a-51e0-4fa3-ad90-84d3f82d129f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.059929 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adf401dc-8b1a-450f-ab64-0b8d881116d4-client-ca" (OuterVolumeSpecName: "client-ca") pod "adf401dc-8b1a-450f-ab64-0b8d881116d4" (UID: "adf401dc-8b1a-450f-ab64-0b8d881116d4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.059998 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f39932a-51e0-4fa3-ad90-84d3f82d129f-client-ca" (OuterVolumeSpecName: "client-ca") pod "8f39932a-51e0-4fa3-ad90-84d3f82d129f" (UID: "8f39932a-51e0-4fa3-ad90-84d3f82d129f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.060045 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adf401dc-8b1a-450f-ab64-0b8d881116d4-config" (OuterVolumeSpecName: "config") pod "adf401dc-8b1a-450f-ab64-0b8d881116d4" (UID: "adf401dc-8b1a-450f-ab64-0b8d881116d4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.061129 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f39932a-51e0-4fa3-ad90-84d3f82d129f-config" (OuterVolumeSpecName: "config") pod "8f39932a-51e0-4fa3-ad90-84d3f82d129f" (UID: "8f39932a-51e0-4fa3-ad90-84d3f82d129f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.063945 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f39932a-51e0-4fa3-ad90-84d3f82d129f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8f39932a-51e0-4fa3-ad90-84d3f82d129f" (UID: "8f39932a-51e0-4fa3-ad90-84d3f82d129f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.063977 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf401dc-8b1a-450f-ab64-0b8d881116d4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "adf401dc-8b1a-450f-ab64-0b8d881116d4" (UID: "adf401dc-8b1a-450f-ab64-0b8d881116d4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.064007 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adf401dc-8b1a-450f-ab64-0b8d881116d4-kube-api-access-k8rpf" (OuterVolumeSpecName: "kube-api-access-k8rpf") pod "adf401dc-8b1a-450f-ab64-0b8d881116d4" (UID: "adf401dc-8b1a-450f-ab64-0b8d881116d4"). InnerVolumeSpecName "kube-api-access-k8rpf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.065418 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f39932a-51e0-4fa3-ad90-84d3f82d129f-kube-api-access-8phzd" (OuterVolumeSpecName: "kube-api-access-8phzd") pod "8f39932a-51e0-4fa3-ad90-84d3f82d129f" (UID: "8f39932a-51e0-4fa3-ad90-84d3f82d129f"). InnerVolumeSpecName "kube-api-access-8phzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.101900 4687 generic.go:334] "Generic (PLEG): container finished" podID="adf401dc-8b1a-450f-ab64-0b8d881116d4" containerID="e253c816abb307ec7c342605ad41dcefcad81d7a30eb6047ccf93190bbaf45cb" exitCode=0 Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.101950 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.101982 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r" event={"ID":"adf401dc-8b1a-450f-ab64-0b8d881116d4","Type":"ContainerDied","Data":"e253c816abb307ec7c342605ad41dcefcad81d7a30eb6047ccf93190bbaf45cb"} Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.102042 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r" event={"ID":"adf401dc-8b1a-450f-ab64-0b8d881116d4","Type":"ContainerDied","Data":"32ffcccdb34c4d13b13447bbf247796d7160f98aff8f328f58d33c484396e161"} Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.102067 4687 scope.go:117] "RemoveContainer" containerID="e253c816abb307ec7c342605ad41dcefcad81d7a30eb6047ccf93190bbaf45cb" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.104080 4687 generic.go:334] "Generic (PLEG): container 
finished" podID="8f39932a-51e0-4fa3-ad90-84d3f82d129f" containerID="88632df7131420b704135b33a8c4032604375fc6cc3dd53382fa94ee1f91631b" exitCode=0 Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.104132 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" event={"ID":"8f39932a-51e0-4fa3-ad90-84d3f82d129f","Type":"ContainerDied","Data":"88632df7131420b704135b33a8c4032604375fc6cc3dd53382fa94ee1f91631b"} Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.104150 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" event={"ID":"8f39932a-51e0-4fa3-ad90-84d3f82d129f","Type":"ContainerDied","Data":"19aaf32554409f11e9f521a3fc9b0340eadafc5c4b8df46297069eb1aa041e0f"} Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.104214 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8448bc778b-8vjlh" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.114718 4687 scope.go:117] "RemoveContainer" containerID="e253c816abb307ec7c342605ad41dcefcad81d7a30eb6047ccf93190bbaf45cb" Feb 28 09:05:51 crc kubenswrapper[4687]: E0228 09:05:51.115077 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e253c816abb307ec7c342605ad41dcefcad81d7a30eb6047ccf93190bbaf45cb\": container with ID starting with e253c816abb307ec7c342605ad41dcefcad81d7a30eb6047ccf93190bbaf45cb not found: ID does not exist" containerID="e253c816abb307ec7c342605ad41dcefcad81d7a30eb6047ccf93190bbaf45cb" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.115106 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e253c816abb307ec7c342605ad41dcefcad81d7a30eb6047ccf93190bbaf45cb"} err="failed to get container status 
\"e253c816abb307ec7c342605ad41dcefcad81d7a30eb6047ccf93190bbaf45cb\": rpc error: code = NotFound desc = could not find container \"e253c816abb307ec7c342605ad41dcefcad81d7a30eb6047ccf93190bbaf45cb\": container with ID starting with e253c816abb307ec7c342605ad41dcefcad81d7a30eb6047ccf93190bbaf45cb not found: ID does not exist" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.115126 4687 scope.go:117] "RemoveContainer" containerID="88632df7131420b704135b33a8c4032604375fc6cc3dd53382fa94ee1f91631b" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.122932 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r"] Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.125384 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8566c8f488-9992r"] Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.132746 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8448bc778b-8vjlh"] Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.132908 4687 scope.go:117] "RemoveContainer" containerID="88632df7131420b704135b33a8c4032604375fc6cc3dd53382fa94ee1f91631b" Feb 28 09:05:51 crc kubenswrapper[4687]: E0228 09:05:51.133474 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88632df7131420b704135b33a8c4032604375fc6cc3dd53382fa94ee1f91631b\": container with ID starting with 88632df7131420b704135b33a8c4032604375fc6cc3dd53382fa94ee1f91631b not found: ID does not exist" containerID="88632df7131420b704135b33a8c4032604375fc6cc3dd53382fa94ee1f91631b" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.133524 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88632df7131420b704135b33a8c4032604375fc6cc3dd53382fa94ee1f91631b"} err="failed to get 
container status \"88632df7131420b704135b33a8c4032604375fc6cc3dd53382fa94ee1f91631b\": rpc error: code = NotFound desc = could not find container \"88632df7131420b704135b33a8c4032604375fc6cc3dd53382fa94ee1f91631b\": container with ID starting with 88632df7131420b704135b33a8c4032604375fc6cc3dd53382fa94ee1f91631b not found: ID does not exist" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.136064 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-8448bc778b-8vjlh"] Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.160437 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8rpf\" (UniqueName: \"kubernetes.io/projected/adf401dc-8b1a-450f-ab64-0b8d881116d4-kube-api-access-k8rpf\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.160460 4687 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adf401dc-8b1a-450f-ab64-0b8d881116d4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.160473 4687 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f39932a-51e0-4fa3-ad90-84d3f82d129f-client-ca\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.160481 4687 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f39932a-51e0-4fa3-ad90-84d3f82d129f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.160490 4687 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8f39932a-51e0-4fa3-ad90-84d3f82d129f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.160499 4687 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/adf401dc-8b1a-450f-ab64-0b8d881116d4-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.160507 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8phzd\" (UniqueName: \"kubernetes.io/projected/8f39932a-51e0-4fa3-ad90-84d3f82d129f-kube-api-access-8phzd\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.160514 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f39932a-51e0-4fa3-ad90-84d3f82d129f-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.160522 4687 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/adf401dc-8b1a-450f-ab64-0b8d881116d4-client-ca\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.232151 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 28 09:05:51 crc kubenswrapper[4687]: E0228 09:05:51.232389 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="512bb25a-8693-4a78-afcc-77e005a73c0f" containerName="registry-server" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.232402 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="512bb25a-8693-4a78-afcc-77e005a73c0f" containerName="registry-server" Feb 28 09:05:51 crc kubenswrapper[4687]: E0228 09:05:51.232417 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f39932a-51e0-4fa3-ad90-84d3f82d129f" containerName="controller-manager" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.232424 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f39932a-51e0-4fa3-ad90-84d3f82d129f" containerName="controller-manager" Feb 28 09:05:51 crc kubenswrapper[4687]: E0228 09:05:51.232434 4687 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="512bb25a-8693-4a78-afcc-77e005a73c0f" containerName="extract-utilities" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.232440 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="512bb25a-8693-4a78-afcc-77e005a73c0f" containerName="extract-utilities" Feb 28 09:05:51 crc kubenswrapper[4687]: E0228 09:05:51.232448 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf401dc-8b1a-450f-ab64-0b8d881116d4" containerName="route-controller-manager" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.232453 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf401dc-8b1a-450f-ab64-0b8d881116d4" containerName="route-controller-manager" Feb 28 09:05:51 crc kubenswrapper[4687]: E0228 09:05:51.232462 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b45242a-b238-4814-b6fa-f22a62c5907f" containerName="kube-multus-additional-cni-plugins" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.232470 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b45242a-b238-4814-b6fa-f22a62c5907f" containerName="kube-multus-additional-cni-plugins" Feb 28 09:05:51 crc kubenswrapper[4687]: E0228 09:05:51.232481 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="512bb25a-8693-4a78-afcc-77e005a73c0f" containerName="extract-content" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.232487 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="512bb25a-8693-4a78-afcc-77e005a73c0f" containerName="extract-content" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.232585 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f39932a-51e0-4fa3-ad90-84d3f82d129f" containerName="controller-manager" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.232596 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf401dc-8b1a-450f-ab64-0b8d881116d4" containerName="route-controller-manager" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 
09:05:51.232603 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b45242a-b238-4814-b6fa-f22a62c5907f" containerName="kube-multus-additional-cni-plugins" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.232610 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="512bb25a-8693-4a78-afcc-77e005a73c0f" containerName="registry-server" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.233006 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.234736 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.235343 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.241786 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.261360 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9f474551-62a6-45fd-ade5-91f6f7c27b87-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9f474551-62a6-45fd-ade5-91f6f7c27b87\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.261403 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9f474551-62a6-45fd-ade5-91f6f7c27b87-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9f474551-62a6-45fd-ade5-91f6f7c27b87\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.306871 4687 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nkgl2" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.333399 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f7sr6" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.361988 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9f474551-62a6-45fd-ade5-91f6f7c27b87-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9f474551-62a6-45fd-ade5-91f6f7c27b87\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.362060 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9f474551-62a6-45fd-ade5-91f6f7c27b87-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9f474551-62a6-45fd-ade5-91f6f7c27b87\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.362066 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9f474551-62a6-45fd-ade5-91f6f7c27b87-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9f474551-62a6-45fd-ade5-91f6f7c27b87\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.375313 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9f474551-62a6-45fd-ade5-91f6f7c27b87-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9f474551-62a6-45fd-ade5-91f6f7c27b87\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.548875 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.713547 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7l47s" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.756660 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k67sw" Feb 28 09:05:51 crc kubenswrapper[4687]: I0228 09:05:51.912112 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 28 09:05:51 crc kubenswrapper[4687]: W0228 09:05:51.915811 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9f474551_62a6_45fd_ade5_91f6f7c27b87.slice/crio-977e3a48e2e7295c76f7717d85f45ccddafb60749ec5c3dc962db98941a71a2f WatchSource:0}: Error finding container 977e3a48e2e7295c76f7717d85f45ccddafb60749ec5c3dc962db98941a71a2f: Status 404 returned error can't find the container with id 977e3a48e2e7295c76f7717d85f45ccddafb60749ec5c3dc962db98941a71a2f Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.117084 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"9f474551-62a6-45fd-ade5-91f6f7c27b87","Type":"ContainerStarted","Data":"b1aac7b5b695983e018b26a632eb5f4c839c08a927eb5f1413bf87a06f2867d2"} Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.117382 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"9f474551-62a6-45fd-ade5-91f6f7c27b87","Type":"ContainerStarted","Data":"977e3a48e2e7295c76f7717d85f45ccddafb60749ec5c3dc962db98941a71a2f"} Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.130118 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
podStartSLOduration=1.1300835710000001 podStartE2EDuration="1.130083571s" podCreationTimestamp="2026-02-28 09:05:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:52.128188256 +0000 UTC m=+143.818757593" watchObservedRunningTime="2026-02-28 09:05:52.130083571 +0000 UTC m=+143.820652907" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.338333 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f"] Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.339369 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.341196 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.341531 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.342094 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.342338 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.342360 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg"] Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.343526 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.343605 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.343940 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.346661 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.347088 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.347361 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.347734 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.348067 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.348366 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.358828 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.360983 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f"] Feb 28 
09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.364336 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg"] Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.477676 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-config\") pod \"controller-manager-ffc9d9c9c-tztvg\" (UID: \"97fbd2d6-e31b-4c77-9821-99ee5fc4866e\") " pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.477726 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe6f8df-8219-45e4-a02b-36b7ad63cf61-config\") pod \"route-controller-manager-676894c655-2fj8f\" (UID: \"6fe6f8df-8219-45e4-a02b-36b7ad63cf61\") " pod="openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.477808 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-client-ca\") pod \"controller-manager-ffc9d9c9c-tztvg\" (UID: \"97fbd2d6-e31b-4c77-9821-99ee5fc4866e\") " pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.477843 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmkxc\" (UniqueName: \"kubernetes.io/projected/6fe6f8df-8219-45e4-a02b-36b7ad63cf61-kube-api-access-vmkxc\") pod \"route-controller-manager-676894c655-2fj8f\" (UID: \"6fe6f8df-8219-45e4-a02b-36b7ad63cf61\") " pod="openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 
09:05:52.477911 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-proxy-ca-bundles\") pod \"controller-manager-ffc9d9c9c-tztvg\" (UID: \"97fbd2d6-e31b-4c77-9821-99ee5fc4866e\") " pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.477947 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-serving-cert\") pod \"controller-manager-ffc9d9c9c-tztvg\" (UID: \"97fbd2d6-e31b-4c77-9821-99ee5fc4866e\") " pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.477971 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnvgv\" (UniqueName: \"kubernetes.io/projected/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-kube-api-access-pnvgv\") pod \"controller-manager-ffc9d9c9c-tztvg\" (UID: \"97fbd2d6-e31b-4c77-9821-99ee5fc4866e\") " pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.477989 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fe6f8df-8219-45e4-a02b-36b7ad63cf61-serving-cert\") pod \"route-controller-manager-676894c655-2fj8f\" (UID: \"6fe6f8df-8219-45e4-a02b-36b7ad63cf61\") " pod="openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.478116 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fe6f8df-8219-45e4-a02b-36b7ad63cf61-client-ca\") 
pod \"route-controller-manager-676894c655-2fj8f\" (UID: \"6fe6f8df-8219-45e4-a02b-36b7ad63cf61\") " pod="openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.578875 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-client-ca\") pod \"controller-manager-ffc9d9c9c-tztvg\" (UID: \"97fbd2d6-e31b-4c77-9821-99ee5fc4866e\") " pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.578921 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmkxc\" (UniqueName: \"kubernetes.io/projected/6fe6f8df-8219-45e4-a02b-36b7ad63cf61-kube-api-access-vmkxc\") pod \"route-controller-manager-676894c655-2fj8f\" (UID: \"6fe6f8df-8219-45e4-a02b-36b7ad63cf61\") " pod="openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.578952 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-proxy-ca-bundles\") pod \"controller-manager-ffc9d9c9c-tztvg\" (UID: \"97fbd2d6-e31b-4c77-9821-99ee5fc4866e\") " pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.578972 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-serving-cert\") pod \"controller-manager-ffc9d9c9c-tztvg\" (UID: \"97fbd2d6-e31b-4c77-9821-99ee5fc4866e\") " pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.579002 4687 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pnvgv\" (UniqueName: \"kubernetes.io/projected/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-kube-api-access-pnvgv\") pod \"controller-manager-ffc9d9c9c-tztvg\" (UID: \"97fbd2d6-e31b-4c77-9821-99ee5fc4866e\") " pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.579043 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fe6f8df-8219-45e4-a02b-36b7ad63cf61-serving-cert\") pod \"route-controller-manager-676894c655-2fj8f\" (UID: \"6fe6f8df-8219-45e4-a02b-36b7ad63cf61\") " pod="openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.579081 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fe6f8df-8219-45e4-a02b-36b7ad63cf61-client-ca\") pod \"route-controller-manager-676894c655-2fj8f\" (UID: \"6fe6f8df-8219-45e4-a02b-36b7ad63cf61\") " pod="openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.579111 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-config\") pod \"controller-manager-ffc9d9c9c-tztvg\" (UID: \"97fbd2d6-e31b-4c77-9821-99ee5fc4866e\") " pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.579135 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe6f8df-8219-45e4-a02b-36b7ad63cf61-config\") pod \"route-controller-manager-676894c655-2fj8f\" (UID: \"6fe6f8df-8219-45e4-a02b-36b7ad63cf61\") " 
pod="openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.580695 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-client-ca\") pod \"controller-manager-ffc9d9c9c-tztvg\" (UID: \"97fbd2d6-e31b-4c77-9821-99ee5fc4866e\") " pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.580769 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe6f8df-8219-45e4-a02b-36b7ad63cf61-config\") pod \"route-controller-manager-676894c655-2fj8f\" (UID: \"6fe6f8df-8219-45e4-a02b-36b7ad63cf61\") " pod="openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.580862 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fe6f8df-8219-45e4-a02b-36b7ad63cf61-client-ca\") pod \"route-controller-manager-676894c655-2fj8f\" (UID: \"6fe6f8df-8219-45e4-a02b-36b7ad63cf61\") " pod="openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.581107 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-proxy-ca-bundles\") pod \"controller-manager-ffc9d9c9c-tztvg\" (UID: \"97fbd2d6-e31b-4c77-9821-99ee5fc4866e\") " pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.581216 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-config\") pod 
\"controller-manager-ffc9d9c9c-tztvg\" (UID: \"97fbd2d6-e31b-4c77-9821-99ee5fc4866e\") " pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.586662 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fe6f8df-8219-45e4-a02b-36b7ad63cf61-serving-cert\") pod \"route-controller-manager-676894c655-2fj8f\" (UID: \"6fe6f8df-8219-45e4-a02b-36b7ad63cf61\") " pod="openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.591838 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-serving-cert\") pod \"controller-manager-ffc9d9c9c-tztvg\" (UID: \"97fbd2d6-e31b-4c77-9821-99ee5fc4866e\") " pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.594617 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnvgv\" (UniqueName: \"kubernetes.io/projected/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-kube-api-access-pnvgv\") pod \"controller-manager-ffc9d9c9c-tztvg\" (UID: \"97fbd2d6-e31b-4c77-9821-99ee5fc4866e\") " pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.597640 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmkxc\" (UniqueName: \"kubernetes.io/projected/6fe6f8df-8219-45e4-a02b-36b7ad63cf61-kube-api-access-vmkxc\") pod \"route-controller-manager-676894c655-2fj8f\" (UID: \"6fe6f8df-8219-45e4-a02b-36b7ad63cf61\") " pod="openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.655155 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.662558 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f39932a-51e0-4fa3-ad90-84d3f82d129f" path="/var/lib/kubelet/pods/8f39932a-51e0-4fa3-ad90-84d3f82d129f/volumes" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.663540 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adf401dc-8b1a-450f-ab64-0b8d881116d4" path="/var/lib/kubelet/pods/adf401dc-8b1a-450f-ab64-0b8d881116d4/volumes" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.663701 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.965748 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k67sw"] Feb 28 09:05:52 crc kubenswrapper[4687]: I0228 09:05:52.966304 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k67sw" podUID="68c495db-6852-4932-996a-053d7c113f22" containerName="registry-server" containerID="cri-o://d663e83d6cf522fb765b47d14de0410716b92051cfd8fb2b7313dfa9c83fe57a" gracePeriod=2 Feb 28 09:05:53 crc kubenswrapper[4687]: I0228 09:05:53.033443 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f"] Feb 28 09:05:53 crc kubenswrapper[4687]: W0228 09:05:53.057673 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fe6f8df_8219_45e4_a02b_36b7ad63cf61.slice/crio-9c898bc47d1dde220e0e3d688771c9bf4d84efe9694fe1131a8ddf08d4db52f9 WatchSource:0}: Error finding container 9c898bc47d1dde220e0e3d688771c9bf4d84efe9694fe1131a8ddf08d4db52f9: Status 404 returned 
error can't find the container with id 9c898bc47d1dde220e0e3d688771c9bf4d84efe9694fe1131a8ddf08d4db52f9 Feb 28 09:05:53 crc kubenswrapper[4687]: I0228 09:05:53.088447 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg"] Feb 28 09:05:53 crc kubenswrapper[4687]: I0228 09:05:53.127088 4687 generic.go:334] "Generic (PLEG): container finished" podID="68c495db-6852-4932-996a-053d7c113f22" containerID="d663e83d6cf522fb765b47d14de0410716b92051cfd8fb2b7313dfa9c83fe57a" exitCode=0 Feb 28 09:05:53 crc kubenswrapper[4687]: I0228 09:05:53.127218 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k67sw" event={"ID":"68c495db-6852-4932-996a-053d7c113f22","Type":"ContainerDied","Data":"d663e83d6cf522fb765b47d14de0410716b92051cfd8fb2b7313dfa9c83fe57a"} Feb 28 09:05:53 crc kubenswrapper[4687]: I0228 09:05:53.135013 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" event={"ID":"97fbd2d6-e31b-4c77-9821-99ee5fc4866e","Type":"ContainerStarted","Data":"41a902b5c1344142597b418a71e9f72d7b78adb670b773c7a98fd13e68427b3a"} Feb 28 09:05:53 crc kubenswrapper[4687]: I0228 09:05:53.137884 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f" event={"ID":"6fe6f8df-8219-45e4-a02b-36b7ad63cf61","Type":"ContainerStarted","Data":"9c898bc47d1dde220e0e3d688771c9bf4d84efe9694fe1131a8ddf08d4db52f9"} Feb 28 09:05:53 crc kubenswrapper[4687]: I0228 09:05:53.139450 4687 generic.go:334] "Generic (PLEG): container finished" podID="9f474551-62a6-45fd-ade5-91f6f7c27b87" containerID="b1aac7b5b695983e018b26a632eb5f4c839c08a927eb5f1413bf87a06f2867d2" exitCode=0 Feb 28 09:05:53 crc kubenswrapper[4687]: I0228 09:05:53.139484 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"9f474551-62a6-45fd-ade5-91f6f7c27b87","Type":"ContainerDied","Data":"b1aac7b5b695983e018b26a632eb5f4c839c08a927eb5f1413bf87a06f2867d2"} Feb 28 09:05:53 crc kubenswrapper[4687]: I0228 09:05:53.293919 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k67sw" Feb 28 09:05:53 crc kubenswrapper[4687]: I0228 09:05:53.490686 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68c495db-6852-4932-996a-053d7c113f22-utilities\") pod \"68c495db-6852-4932-996a-053d7c113f22\" (UID: \"68c495db-6852-4932-996a-053d7c113f22\") " Feb 28 09:05:53 crc kubenswrapper[4687]: I0228 09:05:53.490792 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68c495db-6852-4932-996a-053d7c113f22-catalog-content\") pod \"68c495db-6852-4932-996a-053d7c113f22\" (UID: \"68c495db-6852-4932-996a-053d7c113f22\") " Feb 28 09:05:53 crc kubenswrapper[4687]: I0228 09:05:53.490869 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkgrq\" (UniqueName: \"kubernetes.io/projected/68c495db-6852-4932-996a-053d7c113f22-kube-api-access-zkgrq\") pod \"68c495db-6852-4932-996a-053d7c113f22\" (UID: \"68c495db-6852-4932-996a-053d7c113f22\") " Feb 28 09:05:53 crc kubenswrapper[4687]: I0228 09:05:53.491555 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68c495db-6852-4932-996a-053d7c113f22-utilities" (OuterVolumeSpecName: "utilities") pod "68c495db-6852-4932-996a-053d7c113f22" (UID: "68c495db-6852-4932-996a-053d7c113f22"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:05:53 crc kubenswrapper[4687]: I0228 09:05:53.498399 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68c495db-6852-4932-996a-053d7c113f22-kube-api-access-zkgrq" (OuterVolumeSpecName: "kube-api-access-zkgrq") pod "68c495db-6852-4932-996a-053d7c113f22" (UID: "68c495db-6852-4932-996a-053d7c113f22"). InnerVolumeSpecName "kube-api-access-zkgrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:05:53 crc kubenswrapper[4687]: I0228 09:05:53.540443 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68c495db-6852-4932-996a-053d7c113f22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "68c495db-6852-4932-996a-053d7c113f22" (UID: "68c495db-6852-4932-996a-053d7c113f22"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:05:53 crc kubenswrapper[4687]: I0228 09:05:53.592698 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68c495db-6852-4932-996a-053d7c113f22-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:53 crc kubenswrapper[4687]: I0228 09:05:53.592755 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkgrq\" (UniqueName: \"kubernetes.io/projected/68c495db-6852-4932-996a-053d7c113f22-kube-api-access-zkgrq\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:53 crc kubenswrapper[4687]: I0228 09:05:53.592776 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68c495db-6852-4932-996a-053d7c113f22-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:53 crc kubenswrapper[4687]: I0228 09:05:53.968219 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7l47s"] Feb 28 09:05:53 crc kubenswrapper[4687]: I0228 
09:05:53.968720 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7l47s" podUID="f193be8c-c2cf-4d79-ac3d-fed262658077" containerName="registry-server" containerID="cri-o://2bd81135d7ddffa3488aeba049128355c4f6c7f94a9b0da8256807202a3d81ba" gracePeriod=2 Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.147785 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k67sw" event={"ID":"68c495db-6852-4932-996a-053d7c113f22","Type":"ContainerDied","Data":"c004fca93b67d0e542a5baf69a27021c115b2aad4c1022c9267620e18e2e89ab"} Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.147868 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k67sw" Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.147921 4687 scope.go:117] "RemoveContainer" containerID="d663e83d6cf522fb765b47d14de0410716b92051cfd8fb2b7313dfa9c83fe57a" Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.149536 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" event={"ID":"97fbd2d6-e31b-4c77-9821-99ee5fc4866e","Type":"ContainerStarted","Data":"d456a2161a78e68cb5ef5534411ff7252223f7d5f3e1d85fa4e61cca7062f96d"} Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.149756 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.152436 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f" event={"ID":"6fe6f8df-8219-45e4-a02b-36b7ad63cf61","Type":"ContainerStarted","Data":"f10425c6010b47642cec8340ebadb40027c41bdc6c69ca3992946c657c4fa949"} Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.152915 4687 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f" Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.156065 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.159349 4687 generic.go:334] "Generic (PLEG): container finished" podID="f193be8c-c2cf-4d79-ac3d-fed262658077" containerID="2bd81135d7ddffa3488aeba049128355c4f6c7f94a9b0da8256807202a3d81ba" exitCode=0 Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.159492 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7l47s" event={"ID":"f193be8c-c2cf-4d79-ac3d-fed262658077","Type":"ContainerDied","Data":"2bd81135d7ddffa3488aeba049128355c4f6c7f94a9b0da8256807202a3d81ba"} Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.159566 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f" Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.167211 4687 scope.go:117] "RemoveContainer" containerID="1e314e94326dd925641c2ec84adb71411e9ba3d829a3df3b7f7a97acc9310853" Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.176691 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" podStartSLOduration=4.176658879 podStartE2EDuration="4.176658879s" podCreationTimestamp="2026-02-28 09:05:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:54.174992835 +0000 UTC m=+145.865562182" watchObservedRunningTime="2026-02-28 09:05:54.176658879 +0000 UTC m=+145.867228216" Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.197115 4687 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f" podStartSLOduration=4.197098765 podStartE2EDuration="4.197098765s" podCreationTimestamp="2026-02-28 09:05:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:05:54.195875013 +0000 UTC m=+145.886444350" watchObservedRunningTime="2026-02-28 09:05:54.197098765 +0000 UTC m=+145.887668102" Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.198753 4687 scope.go:117] "RemoveContainer" containerID="ee11752ea57cff8fbbf450d7a5e8036af1da6ca5769933a005dee76de0968284" Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.209236 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k67sw"] Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.210743 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k67sw"] Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.338669 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-npwxl" Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.375760 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7l47s" Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.387203 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-npwxl" Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.404335 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f193be8c-c2cf-4d79-ac3d-fed262658077-utilities\") pod \"f193be8c-c2cf-4d79-ac3d-fed262658077\" (UID: \"f193be8c-c2cf-4d79-ac3d-fed262658077\") " Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.404442 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mjhz\" (UniqueName: \"kubernetes.io/projected/f193be8c-c2cf-4d79-ac3d-fed262658077-kube-api-access-4mjhz\") pod \"f193be8c-c2cf-4d79-ac3d-fed262658077\" (UID: \"f193be8c-c2cf-4d79-ac3d-fed262658077\") " Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.407097 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f193be8c-c2cf-4d79-ac3d-fed262658077-utilities" (OuterVolumeSpecName: "utilities") pod "f193be8c-c2cf-4d79-ac3d-fed262658077" (UID: "f193be8c-c2cf-4d79-ac3d-fed262658077"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.417352 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f193be8c-c2cf-4d79-ac3d-fed262658077-kube-api-access-4mjhz" (OuterVolumeSpecName: "kube-api-access-4mjhz") pod "f193be8c-c2cf-4d79-ac3d-fed262658077" (UID: "f193be8c-c2cf-4d79-ac3d-fed262658077"). InnerVolumeSpecName "kube-api-access-4mjhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.433665 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.505274 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f193be8c-c2cf-4d79-ac3d-fed262658077-catalog-content\") pod \"f193be8c-c2cf-4d79-ac3d-fed262658077\" (UID: \"f193be8c-c2cf-4d79-ac3d-fed262658077\") " Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.505671 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mjhz\" (UniqueName: \"kubernetes.io/projected/f193be8c-c2cf-4d79-ac3d-fed262658077-kube-api-access-4mjhz\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.505693 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f193be8c-c2cf-4d79-ac3d-fed262658077-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.543067 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f193be8c-c2cf-4d79-ac3d-fed262658077-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f193be8c-c2cf-4d79-ac3d-fed262658077" (UID: "f193be8c-c2cf-4d79-ac3d-fed262658077"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.606524 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9f474551-62a6-45fd-ade5-91f6f7c27b87-kube-api-access\") pod \"9f474551-62a6-45fd-ade5-91f6f7c27b87\" (UID: \"9f474551-62a6-45fd-ade5-91f6f7c27b87\") " Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.606603 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9f474551-62a6-45fd-ade5-91f6f7c27b87-kubelet-dir\") pod \"9f474551-62a6-45fd-ade5-91f6f7c27b87\" (UID: \"9f474551-62a6-45fd-ade5-91f6f7c27b87\") " Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.606825 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f474551-62a6-45fd-ade5-91f6f7c27b87-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9f474551-62a6-45fd-ade5-91f6f7c27b87" (UID: "9f474551-62a6-45fd-ade5-91f6f7c27b87"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.607010 4687 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9f474551-62a6-45fd-ade5-91f6f7c27b87-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.607049 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f193be8c-c2cf-4d79-ac3d-fed262658077-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.610555 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f474551-62a6-45fd-ade5-91f6f7c27b87-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9f474551-62a6-45fd-ade5-91f6f7c27b87" (UID: "9f474551-62a6-45fd-ade5-91f6f7c27b87"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.663269 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68c495db-6852-4932-996a-053d7c113f22" path="/var/lib/kubelet/pods/68c495db-6852-4932-996a-053d7c113f22/volumes" Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.708209 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9f474551-62a6-45fd-ade5-91f6f7c27b87-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.741640 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6qvt4" Feb 28 09:05:54 crc kubenswrapper[4687]: I0228 09:05:54.774710 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6qvt4" Feb 28 09:05:55 crc kubenswrapper[4687]: I0228 09:05:55.165198 4687 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"9f474551-62a6-45fd-ade5-91f6f7c27b87","Type":"ContainerDied","Data":"977e3a48e2e7295c76f7717d85f45ccddafb60749ec5c3dc962db98941a71a2f"} Feb 28 09:05:55 crc kubenswrapper[4687]: I0228 09:05:55.165236 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="977e3a48e2e7295c76f7717d85f45ccddafb60749ec5c3dc962db98941a71a2f" Feb 28 09:05:55 crc kubenswrapper[4687]: I0228 09:05:55.165307 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 28 09:05:55 crc kubenswrapper[4687]: I0228 09:05:55.168703 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7l47s" Feb 28 09:05:55 crc kubenswrapper[4687]: I0228 09:05:55.168803 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7l47s" event={"ID":"f193be8c-c2cf-4d79-ac3d-fed262658077","Type":"ContainerDied","Data":"430421ce6c3f665f88b6cc3c6ee3c75759e5a7496f43f54297ad5fb7eb5f1192"} Feb 28 09:05:55 crc kubenswrapper[4687]: I0228 09:05:55.169006 4687 scope.go:117] "RemoveContainer" containerID="2bd81135d7ddffa3488aeba049128355c4f6c7f94a9b0da8256807202a3d81ba" Feb 28 09:05:55 crc kubenswrapper[4687]: I0228 09:05:55.181952 4687 scope.go:117] "RemoveContainer" containerID="d9de5fc46fef3ee0a0a30cb03de3cf268ecafd55cc468a6ec289f3d271100ff7" Feb 28 09:05:55 crc kubenswrapper[4687]: I0228 09:05:55.182246 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7l47s"] Feb 28 09:05:55 crc kubenswrapper[4687]: I0228 09:05:55.191328 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7l47s"] Feb 28 09:05:55 crc kubenswrapper[4687]: I0228 09:05:55.200710 4687 scope.go:117] "RemoveContainer" 
containerID="36cee0dd39e5c0666fee9deec291840ff3de8af7e94270b6a64014c49c8eda58" Feb 28 09:05:56 crc kubenswrapper[4687]: I0228 09:05:56.663770 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f193be8c-c2cf-4d79-ac3d-fed262658077" path="/var/lib/kubelet/pods/f193be8c-c2cf-4d79-ac3d-fed262658077/volumes" Feb 28 09:05:57 crc kubenswrapper[4687]: I0228 09:05:57.365636 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6qvt4"] Feb 28 09:05:57 crc kubenswrapper[4687]: I0228 09:05:57.365922 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6qvt4" podUID="998b35fc-9704-4608-94c8-eccb4ca28857" containerName="registry-server" containerID="cri-o://3c8c0bbd4f3cfedae4216f03f8f79fdb701c2093a136c8b430ee730b5f921fe7" gracePeriod=2 Feb 28 09:05:57 crc kubenswrapper[4687]: I0228 09:05:57.766683 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6qvt4" Feb 28 09:05:57 crc kubenswrapper[4687]: I0228 09:05:57.943976 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/998b35fc-9704-4608-94c8-eccb4ca28857-utilities\") pod \"998b35fc-9704-4608-94c8-eccb4ca28857\" (UID: \"998b35fc-9704-4608-94c8-eccb4ca28857\") " Feb 28 09:05:57 crc kubenswrapper[4687]: I0228 09:05:57.944106 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55vnd\" (UniqueName: \"kubernetes.io/projected/998b35fc-9704-4608-94c8-eccb4ca28857-kube-api-access-55vnd\") pod \"998b35fc-9704-4608-94c8-eccb4ca28857\" (UID: \"998b35fc-9704-4608-94c8-eccb4ca28857\") " Feb 28 09:05:57 crc kubenswrapper[4687]: I0228 09:05:57.944154 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/998b35fc-9704-4608-94c8-eccb4ca28857-catalog-content\") pod \"998b35fc-9704-4608-94c8-eccb4ca28857\" (UID: \"998b35fc-9704-4608-94c8-eccb4ca28857\") " Feb 28 09:05:57 crc kubenswrapper[4687]: I0228 09:05:57.945248 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/998b35fc-9704-4608-94c8-eccb4ca28857-utilities" (OuterVolumeSpecName: "utilities") pod "998b35fc-9704-4608-94c8-eccb4ca28857" (UID: "998b35fc-9704-4608-94c8-eccb4ca28857"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:05:57 crc kubenswrapper[4687]: I0228 09:05:57.950292 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/998b35fc-9704-4608-94c8-eccb4ca28857-kube-api-access-55vnd" (OuterVolumeSpecName: "kube-api-access-55vnd") pod "998b35fc-9704-4608-94c8-eccb4ca28857" (UID: "998b35fc-9704-4608-94c8-eccb4ca28857"). InnerVolumeSpecName "kube-api-access-55vnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:05:58 crc kubenswrapper[4687]: I0228 09:05:58.040045 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/998b35fc-9704-4608-94c8-eccb4ca28857-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "998b35fc-9704-4608-94c8-eccb4ca28857" (UID: "998b35fc-9704-4608-94c8-eccb4ca28857"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:05:58 crc kubenswrapper[4687]: I0228 09:05:58.045570 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/998b35fc-9704-4608-94c8-eccb4ca28857-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:58 crc kubenswrapper[4687]: I0228 09:05:58.045597 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55vnd\" (UniqueName: \"kubernetes.io/projected/998b35fc-9704-4608-94c8-eccb4ca28857-kube-api-access-55vnd\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:58 crc kubenswrapper[4687]: I0228 09:05:58.045608 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/998b35fc-9704-4608-94c8-eccb4ca28857-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:05:58 crc kubenswrapper[4687]: I0228 09:05:58.189887 4687 generic.go:334] "Generic (PLEG): container finished" podID="998b35fc-9704-4608-94c8-eccb4ca28857" containerID="3c8c0bbd4f3cfedae4216f03f8f79fdb701c2093a136c8b430ee730b5f921fe7" exitCode=0 Feb 28 09:05:58 crc kubenswrapper[4687]: I0228 09:05:58.189942 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qvt4" event={"ID":"998b35fc-9704-4608-94c8-eccb4ca28857","Type":"ContainerDied","Data":"3c8c0bbd4f3cfedae4216f03f8f79fdb701c2093a136c8b430ee730b5f921fe7"} Feb 28 09:05:58 crc kubenswrapper[4687]: I0228 09:05:58.189976 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6qvt4" Feb 28 09:05:58 crc kubenswrapper[4687]: I0228 09:05:58.189996 4687 scope.go:117] "RemoveContainer" containerID="3c8c0bbd4f3cfedae4216f03f8f79fdb701c2093a136c8b430ee730b5f921fe7" Feb 28 09:05:58 crc kubenswrapper[4687]: I0228 09:05:58.189983 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qvt4" event={"ID":"998b35fc-9704-4608-94c8-eccb4ca28857","Type":"ContainerDied","Data":"011a513c919b65446657013d6079e813d248f8214a2adb3500124671c4131c96"} Feb 28 09:05:58 crc kubenswrapper[4687]: I0228 09:05:58.213770 4687 scope.go:117] "RemoveContainer" containerID="0945a46efecb298e6d2daf2027d300d4d886e1f024eced027f6c311f673016ab" Feb 28 09:05:58 crc kubenswrapper[4687]: I0228 09:05:58.213888 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6qvt4"] Feb 28 09:05:58 crc kubenswrapper[4687]: I0228 09:05:58.216220 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6qvt4"] Feb 28 09:05:58 crc kubenswrapper[4687]: I0228 09:05:58.236715 4687 scope.go:117] "RemoveContainer" containerID="29bf05d54b6fa3035a3f110a57950728b32d06e6c1b48be2a6566424b783dbd1" Feb 28 09:05:58 crc kubenswrapper[4687]: I0228 09:05:58.250872 4687 scope.go:117] "RemoveContainer" containerID="3c8c0bbd4f3cfedae4216f03f8f79fdb701c2093a136c8b430ee730b5f921fe7" Feb 28 09:05:58 crc kubenswrapper[4687]: E0228 09:05:58.251200 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c8c0bbd4f3cfedae4216f03f8f79fdb701c2093a136c8b430ee730b5f921fe7\": container with ID starting with 3c8c0bbd4f3cfedae4216f03f8f79fdb701c2093a136c8b430ee730b5f921fe7 not found: ID does not exist" containerID="3c8c0bbd4f3cfedae4216f03f8f79fdb701c2093a136c8b430ee730b5f921fe7" Feb 28 09:05:58 crc kubenswrapper[4687]: I0228 09:05:58.251237 4687 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c8c0bbd4f3cfedae4216f03f8f79fdb701c2093a136c8b430ee730b5f921fe7"} err="failed to get container status \"3c8c0bbd4f3cfedae4216f03f8f79fdb701c2093a136c8b430ee730b5f921fe7\": rpc error: code = NotFound desc = could not find container \"3c8c0bbd4f3cfedae4216f03f8f79fdb701c2093a136c8b430ee730b5f921fe7\": container with ID starting with 3c8c0bbd4f3cfedae4216f03f8f79fdb701c2093a136c8b430ee730b5f921fe7 not found: ID does not exist" Feb 28 09:05:58 crc kubenswrapper[4687]: I0228 09:05:58.251262 4687 scope.go:117] "RemoveContainer" containerID="0945a46efecb298e6d2daf2027d300d4d886e1f024eced027f6c311f673016ab" Feb 28 09:05:58 crc kubenswrapper[4687]: E0228 09:05:58.251580 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0945a46efecb298e6d2daf2027d300d4d886e1f024eced027f6c311f673016ab\": container with ID starting with 0945a46efecb298e6d2daf2027d300d4d886e1f024eced027f6c311f673016ab not found: ID does not exist" containerID="0945a46efecb298e6d2daf2027d300d4d886e1f024eced027f6c311f673016ab" Feb 28 09:05:58 crc kubenswrapper[4687]: I0228 09:05:58.251611 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0945a46efecb298e6d2daf2027d300d4d886e1f024eced027f6c311f673016ab"} err="failed to get container status \"0945a46efecb298e6d2daf2027d300d4d886e1f024eced027f6c311f673016ab\": rpc error: code = NotFound desc = could not find container \"0945a46efecb298e6d2daf2027d300d4d886e1f024eced027f6c311f673016ab\": container with ID starting with 0945a46efecb298e6d2daf2027d300d4d886e1f024eced027f6c311f673016ab not found: ID does not exist" Feb 28 09:05:58 crc kubenswrapper[4687]: I0228 09:05:58.251630 4687 scope.go:117] "RemoveContainer" containerID="29bf05d54b6fa3035a3f110a57950728b32d06e6c1b48be2a6566424b783dbd1" Feb 28 09:05:58 crc kubenswrapper[4687]: E0228 
09:05:58.252038 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29bf05d54b6fa3035a3f110a57950728b32d06e6c1b48be2a6566424b783dbd1\": container with ID starting with 29bf05d54b6fa3035a3f110a57950728b32d06e6c1b48be2a6566424b783dbd1 not found: ID does not exist" containerID="29bf05d54b6fa3035a3f110a57950728b32d06e6c1b48be2a6566424b783dbd1" Feb 28 09:05:58 crc kubenswrapper[4687]: I0228 09:05:58.252079 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29bf05d54b6fa3035a3f110a57950728b32d06e6c1b48be2a6566424b783dbd1"} err="failed to get container status \"29bf05d54b6fa3035a3f110a57950728b32d06e6c1b48be2a6566424b783dbd1\": rpc error: code = NotFound desc = could not find container \"29bf05d54b6fa3035a3f110a57950728b32d06e6c1b48be2a6566424b783dbd1\": container with ID starting with 29bf05d54b6fa3035a3f110a57950728b32d06e6c1b48be2a6566424b783dbd1 not found: ID does not exist" Feb 28 09:05:58 crc kubenswrapper[4687]: I0228 09:05:58.666317 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="998b35fc-9704-4608-94c8-eccb4ca28857" path="/var/lib/kubelet/pods/998b35fc-9704-4608-94c8-eccb4ca28857/volumes" Feb 28 09:06:00 crc kubenswrapper[4687]: I0228 09:06:00.125188 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537826-t7ghx"] Feb 28 09:06:00 crc kubenswrapper[4687]: E0228 09:06:00.125571 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f193be8c-c2cf-4d79-ac3d-fed262658077" containerName="extract-utilities" Feb 28 09:06:00 crc kubenswrapper[4687]: I0228 09:06:00.125584 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="f193be8c-c2cf-4d79-ac3d-fed262658077" containerName="extract-utilities" Feb 28 09:06:00 crc kubenswrapper[4687]: E0228 09:06:00.125599 4687 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f193be8c-c2cf-4d79-ac3d-fed262658077" containerName="extract-content" Feb 28 09:06:00 crc kubenswrapper[4687]: I0228 09:06:00.125605 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="f193be8c-c2cf-4d79-ac3d-fed262658077" containerName="extract-content" Feb 28 09:06:00 crc kubenswrapper[4687]: E0228 09:06:00.125612 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f474551-62a6-45fd-ade5-91f6f7c27b87" containerName="pruner" Feb 28 09:06:00 crc kubenswrapper[4687]: I0228 09:06:00.125618 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f474551-62a6-45fd-ade5-91f6f7c27b87" containerName="pruner" Feb 28 09:06:00 crc kubenswrapper[4687]: E0228 09:06:00.125626 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="998b35fc-9704-4608-94c8-eccb4ca28857" containerName="extract-content" Feb 28 09:06:00 crc kubenswrapper[4687]: I0228 09:06:00.125632 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="998b35fc-9704-4608-94c8-eccb4ca28857" containerName="extract-content" Feb 28 09:06:00 crc kubenswrapper[4687]: E0228 09:06:00.125639 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68c495db-6852-4932-996a-053d7c113f22" containerName="extract-utilities" Feb 28 09:06:00 crc kubenswrapper[4687]: I0228 09:06:00.125644 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="68c495db-6852-4932-996a-053d7c113f22" containerName="extract-utilities" Feb 28 09:06:00 crc kubenswrapper[4687]: E0228 09:06:00.125656 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="998b35fc-9704-4608-94c8-eccb4ca28857" containerName="extract-utilities" Feb 28 09:06:00 crc kubenswrapper[4687]: I0228 09:06:00.125661 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="998b35fc-9704-4608-94c8-eccb4ca28857" containerName="extract-utilities" Feb 28 09:06:00 crc kubenswrapper[4687]: E0228 09:06:00.125668 4687 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="68c495db-6852-4932-996a-053d7c113f22" containerName="extract-content" Feb 28 09:06:00 crc kubenswrapper[4687]: I0228 09:06:00.125673 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="68c495db-6852-4932-996a-053d7c113f22" containerName="extract-content" Feb 28 09:06:00 crc kubenswrapper[4687]: E0228 09:06:00.125679 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="998b35fc-9704-4608-94c8-eccb4ca28857" containerName="registry-server" Feb 28 09:06:00 crc kubenswrapper[4687]: I0228 09:06:00.125685 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="998b35fc-9704-4608-94c8-eccb4ca28857" containerName="registry-server" Feb 28 09:06:00 crc kubenswrapper[4687]: E0228 09:06:00.125693 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68c495db-6852-4932-996a-053d7c113f22" containerName="registry-server" Feb 28 09:06:00 crc kubenswrapper[4687]: I0228 09:06:00.125698 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="68c495db-6852-4932-996a-053d7c113f22" containerName="registry-server" Feb 28 09:06:00 crc kubenswrapper[4687]: E0228 09:06:00.125706 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f193be8c-c2cf-4d79-ac3d-fed262658077" containerName="registry-server" Feb 28 09:06:00 crc kubenswrapper[4687]: I0228 09:06:00.125712 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="f193be8c-c2cf-4d79-ac3d-fed262658077" containerName="registry-server" Feb 28 09:06:00 crc kubenswrapper[4687]: I0228 09:06:00.125818 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="68c495db-6852-4932-996a-053d7c113f22" containerName="registry-server" Feb 28 09:06:00 crc kubenswrapper[4687]: I0228 09:06:00.125840 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="998b35fc-9704-4608-94c8-eccb4ca28857" containerName="registry-server" Feb 28 09:06:00 crc kubenswrapper[4687]: I0228 09:06:00.125847 4687 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9f474551-62a6-45fd-ade5-91f6f7c27b87" containerName="pruner" Feb 28 09:06:00 crc kubenswrapper[4687]: I0228 09:06:00.125857 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="f193be8c-c2cf-4d79-ac3d-fed262658077" containerName="registry-server" Feb 28 09:06:00 crc kubenswrapper[4687]: I0228 09:06:00.126232 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537826-t7ghx" Feb 28 09:06:00 crc kubenswrapper[4687]: I0228 09:06:00.128267 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:06:00 crc kubenswrapper[4687]: I0228 09:06:00.128269 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:06:00 crc kubenswrapper[4687]: I0228 09:06:00.129705 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:06:00 crc kubenswrapper[4687]: I0228 09:06:00.133003 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537826-t7ghx"] Feb 28 09:06:00 crc kubenswrapper[4687]: I0228 09:06:00.170401 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq7fm\" (UniqueName: \"kubernetes.io/projected/ecbc5046-5a52-46b8-8d92-45e22891bd1d-kube-api-access-zq7fm\") pod \"auto-csr-approver-29537826-t7ghx\" (UID: \"ecbc5046-5a52-46b8-8d92-45e22891bd1d\") " pod="openshift-infra/auto-csr-approver-29537826-t7ghx" Feb 28 09:06:00 crc kubenswrapper[4687]: I0228 09:06:00.271393 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq7fm\" (UniqueName: \"kubernetes.io/projected/ecbc5046-5a52-46b8-8d92-45e22891bd1d-kube-api-access-zq7fm\") pod \"auto-csr-approver-29537826-t7ghx\" (UID: \"ecbc5046-5a52-46b8-8d92-45e22891bd1d\") " 
pod="openshift-infra/auto-csr-approver-29537826-t7ghx" Feb 28 09:06:00 crc kubenswrapper[4687]: I0228 09:06:00.287871 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq7fm\" (UniqueName: \"kubernetes.io/projected/ecbc5046-5a52-46b8-8d92-45e22891bd1d-kube-api-access-zq7fm\") pod \"auto-csr-approver-29537826-t7ghx\" (UID: \"ecbc5046-5a52-46b8-8d92-45e22891bd1d\") " pod="openshift-infra/auto-csr-approver-29537826-t7ghx" Feb 28 09:06:00 crc kubenswrapper[4687]: I0228 09:06:00.440107 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537826-t7ghx" Feb 28 09:06:00 crc kubenswrapper[4687]: I0228 09:06:00.816078 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537826-t7ghx"] Feb 28 09:06:00 crc kubenswrapper[4687]: W0228 09:06:00.823273 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecbc5046_5a52_46b8_8d92_45e22891bd1d.slice/crio-83265cc27c47742adff7fd76643569599a9c71dea8f81ddf03be865c75ef9ec2 WatchSource:0}: Error finding container 83265cc27c47742adff7fd76643569599a9c71dea8f81ddf03be865c75ef9ec2: Status 404 returned error can't find the container with id 83265cc27c47742adff7fd76643569599a9c71dea8f81ddf03be865c75ef9ec2 Feb 28 09:06:01 crc kubenswrapper[4687]: I0228 09:06:01.210604 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537826-t7ghx" event={"ID":"ecbc5046-5a52-46b8-8d92-45e22891bd1d","Type":"ContainerStarted","Data":"83265cc27c47742adff7fd76643569599a9c71dea8f81ddf03be865c75ef9ec2"} Feb 28 09:06:02 crc kubenswrapper[4687]: I0228 09:06:02.672883 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 28 09:06:06 crc kubenswrapper[4687]: I0228 09:06:06.423328 4687 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver/installer-9-crc"] Feb 28 09:06:06 crc kubenswrapper[4687]: I0228 09:06:06.425330 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 28 09:06:06 crc kubenswrapper[4687]: I0228 09:06:06.428633 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 28 09:06:06 crc kubenswrapper[4687]: I0228 09:06:06.431573 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 28 09:06:06 crc kubenswrapper[4687]: I0228 09:06:06.431809 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 28 09:06:06 crc kubenswrapper[4687]: I0228 09:06:06.454983 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=4.454962693 podStartE2EDuration="4.454962693s" podCreationTimestamp="2026-02-28 09:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:06:06.453841484 +0000 UTC m=+158.144410821" watchObservedRunningTime="2026-02-28 09:06:06.454962693 +0000 UTC m=+158.145532029" Feb 28 09:06:06 crc kubenswrapper[4687]: I0228 09:06:06.553845 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de6e0421-8332-4c89-bdcb-4406af730891-kubelet-dir\") pod \"installer-9-crc\" (UID: \"de6e0421-8332-4c89-bdcb-4406af730891\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 28 09:06:06 crc kubenswrapper[4687]: I0228 09:06:06.554051 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/de6e0421-8332-4c89-bdcb-4406af730891-kube-api-access\") pod \"installer-9-crc\" (UID: \"de6e0421-8332-4c89-bdcb-4406af730891\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 28 09:06:06 crc kubenswrapper[4687]: I0228 09:06:06.554093 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/de6e0421-8332-4c89-bdcb-4406af730891-var-lock\") pod \"installer-9-crc\" (UID: \"de6e0421-8332-4c89-bdcb-4406af730891\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 28 09:06:06 crc kubenswrapper[4687]: I0228 09:06:06.655207 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de6e0421-8332-4c89-bdcb-4406af730891-kube-api-access\") pod \"installer-9-crc\" (UID: \"de6e0421-8332-4c89-bdcb-4406af730891\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 28 09:06:06 crc kubenswrapper[4687]: I0228 09:06:06.655268 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/de6e0421-8332-4c89-bdcb-4406af730891-var-lock\") pod \"installer-9-crc\" (UID: \"de6e0421-8332-4c89-bdcb-4406af730891\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 28 09:06:06 crc kubenswrapper[4687]: I0228 09:06:06.655344 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de6e0421-8332-4c89-bdcb-4406af730891-kubelet-dir\") pod \"installer-9-crc\" (UID: \"de6e0421-8332-4c89-bdcb-4406af730891\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 28 09:06:06 crc kubenswrapper[4687]: I0228 09:06:06.655443 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de6e0421-8332-4c89-bdcb-4406af730891-kubelet-dir\") pod \"installer-9-crc\" (UID: 
\"de6e0421-8332-4c89-bdcb-4406af730891\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 28 09:06:06 crc kubenswrapper[4687]: I0228 09:06:06.655488 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/de6e0421-8332-4c89-bdcb-4406af730891-var-lock\") pod \"installer-9-crc\" (UID: \"de6e0421-8332-4c89-bdcb-4406af730891\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 28 09:06:06 crc kubenswrapper[4687]: I0228 09:06:06.675207 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de6e0421-8332-4c89-bdcb-4406af730891-kube-api-access\") pod \"installer-9-crc\" (UID: \"de6e0421-8332-4c89-bdcb-4406af730891\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 28 09:06:06 crc kubenswrapper[4687]: I0228 09:06:06.742001 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 28 09:06:07 crc kubenswrapper[4687]: I0228 09:06:07.179770 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 28 09:06:07 crc kubenswrapper[4687]: W0228 09:06:07.187698 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podde6e0421_8332_4c89_bdcb_4406af730891.slice/crio-81ca7f999d26f9e5d0fcd145958b72b30af1114b6d4c85ba4d911c9053c59754 WatchSource:0}: Error finding container 81ca7f999d26f9e5d0fcd145958b72b30af1114b6d4c85ba4d911c9053c59754: Status 404 returned error can't find the container with id 81ca7f999d26f9e5d0fcd145958b72b30af1114b6d4c85ba4d911c9053c59754 Feb 28 09:06:07 crc kubenswrapper[4687]: I0228 09:06:07.265481 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"de6e0421-8332-4c89-bdcb-4406af730891","Type":"ContainerStarted","Data":"81ca7f999d26f9e5d0fcd145958b72b30af1114b6d4c85ba4d911c9053c59754"} Feb 
28 09:06:07 crc kubenswrapper[4687]: I0228 09:06:07.268000 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537826-t7ghx" event={"ID":"ecbc5046-5a52-46b8-8d92-45e22891bd1d","Type":"ContainerStarted","Data":"0f446d8f8117a09c10f580b771adb69dacf34cd28c37ea0c9ccc12f3d3093445"} Feb 28 09:06:07 crc kubenswrapper[4687]: I0228 09:06:07.281531 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29537826-t7ghx" podStartSLOduration=1.264570719 podStartE2EDuration="7.281515462s" podCreationTimestamp="2026-02-28 09:06:00 +0000 UTC" firstStartedPulling="2026-02-28 09:06:00.825460302 +0000 UTC m=+152.516029639" lastFinishedPulling="2026-02-28 09:06:06.842405045 +0000 UTC m=+158.532974382" observedRunningTime="2026-02-28 09:06:07.278934568 +0000 UTC m=+158.969503904" watchObservedRunningTime="2026-02-28 09:06:07.281515462 +0000 UTC m=+158.972084799" Feb 28 09:06:07 crc kubenswrapper[4687]: I0228 09:06:07.365704 4687 csr.go:261] certificate signing request csr-rvsc7 is approved, waiting to be issued Feb 28 09:06:07 crc kubenswrapper[4687]: I0228 09:06:07.371186 4687 csr.go:257] certificate signing request csr-rvsc7 is issued Feb 28 09:06:08 crc kubenswrapper[4687]: I0228 09:06:08.276194 4687 generic.go:334] "Generic (PLEG): container finished" podID="ecbc5046-5a52-46b8-8d92-45e22891bd1d" containerID="0f446d8f8117a09c10f580b771adb69dacf34cd28c37ea0c9ccc12f3d3093445" exitCode=0 Feb 28 09:06:08 crc kubenswrapper[4687]: I0228 09:06:08.276463 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537826-t7ghx" event={"ID":"ecbc5046-5a52-46b8-8d92-45e22891bd1d","Type":"ContainerDied","Data":"0f446d8f8117a09c10f580b771adb69dacf34cd28c37ea0c9ccc12f3d3093445"} Feb 28 09:06:08 crc kubenswrapper[4687]: I0228 09:06:08.278070 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"de6e0421-8332-4c89-bdcb-4406af730891","Type":"ContainerStarted","Data":"d89880e3617e39c85fd05ec0a5463d54d24e72306d592ca43563c703c98c6587"} Feb 28 09:06:08 crc kubenswrapper[4687]: I0228 09:06:08.308642 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.30861189 podStartE2EDuration="2.30861189s" podCreationTimestamp="2026-02-28 09:06:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:06:08.304143736 +0000 UTC m=+159.994713073" watchObservedRunningTime="2026-02-28 09:06:08.30861189 +0000 UTC m=+159.999181228" Feb 28 09:06:08 crc kubenswrapper[4687]: I0228 09:06:08.372386 4687 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-05 14:12:58.020282507 +0000 UTC Feb 28 09:06:08 crc kubenswrapper[4687]: I0228 09:06:08.372427 4687 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7469h6m49.647857529s for next certificate rotation Feb 28 09:06:09 crc kubenswrapper[4687]: I0228 09:06:09.373365 4687 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-30 14:09:39.868239415 +0000 UTC Feb 28 09:06:09 crc kubenswrapper[4687]: I0228 09:06:09.373660 4687 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7325h3m30.494582337s for next certificate rotation Feb 28 09:06:09 crc kubenswrapper[4687]: I0228 09:06:09.575458 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537826-t7ghx" Feb 28 09:06:09 crc kubenswrapper[4687]: I0228 09:06:09.586926 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq7fm\" (UniqueName: \"kubernetes.io/projected/ecbc5046-5a52-46b8-8d92-45e22891bd1d-kube-api-access-zq7fm\") pod \"ecbc5046-5a52-46b8-8d92-45e22891bd1d\" (UID: \"ecbc5046-5a52-46b8-8d92-45e22891bd1d\") " Feb 28 09:06:09 crc kubenswrapper[4687]: I0228 09:06:09.592091 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecbc5046-5a52-46b8-8d92-45e22891bd1d-kube-api-access-zq7fm" (OuterVolumeSpecName: "kube-api-access-zq7fm") pod "ecbc5046-5a52-46b8-8d92-45e22891bd1d" (UID: "ecbc5046-5a52-46b8-8d92-45e22891bd1d"). InnerVolumeSpecName "kube-api-access-zq7fm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:06:09 crc kubenswrapper[4687]: I0228 09:06:09.688159 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq7fm\" (UniqueName: \"kubernetes.io/projected/ecbc5046-5a52-46b8-8d92-45e22891bd1d-kube-api-access-zq7fm\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:10 crc kubenswrapper[4687]: I0228 09:06:10.289966 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537826-t7ghx" event={"ID":"ecbc5046-5a52-46b8-8d92-45e22891bd1d","Type":"ContainerDied","Data":"83265cc27c47742adff7fd76643569599a9c71dea8f81ddf03be865c75ef9ec2"} Feb 28 09:06:10 crc kubenswrapper[4687]: I0228 09:06:10.290044 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83265cc27c47742adff7fd76643569599a9c71dea8f81ddf03be865c75ef9ec2" Feb 28 09:06:10 crc kubenswrapper[4687]: I0228 09:06:10.290042 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537826-t7ghx" Feb 28 09:06:10 crc kubenswrapper[4687]: I0228 09:06:10.489555 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg"] Feb 28 09:06:10 crc kubenswrapper[4687]: I0228 09:06:10.489814 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" podUID="97fbd2d6-e31b-4c77-9821-99ee5fc4866e" containerName="controller-manager" containerID="cri-o://d456a2161a78e68cb5ef5534411ff7252223f7d5f3e1d85fa4e61cca7062f96d" gracePeriod=30 Feb 28 09:06:10 crc kubenswrapper[4687]: I0228 09:06:10.500500 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f"] Feb 28 09:06:10 crc kubenswrapper[4687]: I0228 09:06:10.500705 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f" podUID="6fe6f8df-8219-45e4-a02b-36b7ad63cf61" containerName="route-controller-manager" containerID="cri-o://f10425c6010b47642cec8340ebadb40027c41bdc6c69ca3992946c657c4fa949" gracePeriod=30 Feb 28 09:06:10 crc kubenswrapper[4687]: I0228 09:06:10.931645 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.021323 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.104727 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe6f8df-8219-45e4-a02b-36b7ad63cf61-config\") pod \"6fe6f8df-8219-45e4-a02b-36b7ad63cf61\" (UID: \"6fe6f8df-8219-45e4-a02b-36b7ad63cf61\") " Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.104784 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fe6f8df-8219-45e4-a02b-36b7ad63cf61-serving-cert\") pod \"6fe6f8df-8219-45e4-a02b-36b7ad63cf61\" (UID: \"6fe6f8df-8219-45e4-a02b-36b7ad63cf61\") " Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.104884 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fe6f8df-8219-45e4-a02b-36b7ad63cf61-client-ca\") pod \"6fe6f8df-8219-45e4-a02b-36b7ad63cf61\" (UID: \"6fe6f8df-8219-45e4-a02b-36b7ad63cf61\") " Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.104965 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmkxc\" (UniqueName: \"kubernetes.io/projected/6fe6f8df-8219-45e4-a02b-36b7ad63cf61-kube-api-access-vmkxc\") pod \"6fe6f8df-8219-45e4-a02b-36b7ad63cf61\" (UID: \"6fe6f8df-8219-45e4-a02b-36b7ad63cf61\") " Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.105679 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fe6f8df-8219-45e4-a02b-36b7ad63cf61-client-ca" (OuterVolumeSpecName: "client-ca") pod "6fe6f8df-8219-45e4-a02b-36b7ad63cf61" (UID: "6fe6f8df-8219-45e4-a02b-36b7ad63cf61"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.105687 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fe6f8df-8219-45e4-a02b-36b7ad63cf61-config" (OuterVolumeSpecName: "config") pod "6fe6f8df-8219-45e4-a02b-36b7ad63cf61" (UID: "6fe6f8df-8219-45e4-a02b-36b7ad63cf61"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.106161 4687 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fe6f8df-8219-45e4-a02b-36b7ad63cf61-client-ca\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.106186 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe6f8df-8219-45e4-a02b-36b7ad63cf61-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.110233 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fe6f8df-8219-45e4-a02b-36b7ad63cf61-kube-api-access-vmkxc" (OuterVolumeSpecName: "kube-api-access-vmkxc") pod "6fe6f8df-8219-45e4-a02b-36b7ad63cf61" (UID: "6fe6f8df-8219-45e4-a02b-36b7ad63cf61"). InnerVolumeSpecName "kube-api-access-vmkxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.110267 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe6f8df-8219-45e4-a02b-36b7ad63cf61-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6fe6f8df-8219-45e4-a02b-36b7ad63cf61" (UID: "6fe6f8df-8219-45e4-a02b-36b7ad63cf61"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.206854 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnvgv\" (UniqueName: \"kubernetes.io/projected/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-kube-api-access-pnvgv\") pod \"97fbd2d6-e31b-4c77-9821-99ee5fc4866e\" (UID: \"97fbd2d6-e31b-4c77-9821-99ee5fc4866e\") " Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.206981 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-config\") pod \"97fbd2d6-e31b-4c77-9821-99ee5fc4866e\" (UID: \"97fbd2d6-e31b-4c77-9821-99ee5fc4866e\") " Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.207093 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-client-ca\") pod \"97fbd2d6-e31b-4c77-9821-99ee5fc4866e\" (UID: \"97fbd2d6-e31b-4c77-9821-99ee5fc4866e\") " Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.207257 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-proxy-ca-bundles\") pod \"97fbd2d6-e31b-4c77-9821-99ee5fc4866e\" (UID: \"97fbd2d6-e31b-4c77-9821-99ee5fc4866e\") " Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.207304 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-serving-cert\") pod \"97fbd2d6-e31b-4c77-9821-99ee5fc4866e\" (UID: \"97fbd2d6-e31b-4c77-9821-99ee5fc4866e\") " Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.207510 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-client-ca" (OuterVolumeSpecName: "client-ca") pod "97fbd2d6-e31b-4c77-9821-99ee5fc4866e" (UID: "97fbd2d6-e31b-4c77-9821-99ee5fc4866e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.207565 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmkxc\" (UniqueName: \"kubernetes.io/projected/6fe6f8df-8219-45e4-a02b-36b7ad63cf61-kube-api-access-vmkxc\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.207579 4687 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fe6f8df-8219-45e4-a02b-36b7ad63cf61-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.207662 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-config" (OuterVolumeSpecName: "config") pod "97fbd2d6-e31b-4c77-9821-99ee5fc4866e" (UID: "97fbd2d6-e31b-4c77-9821-99ee5fc4866e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.207727 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "97fbd2d6-e31b-4c77-9821-99ee5fc4866e" (UID: "97fbd2d6-e31b-4c77-9821-99ee5fc4866e"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.210564 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-kube-api-access-pnvgv" (OuterVolumeSpecName: "kube-api-access-pnvgv") pod "97fbd2d6-e31b-4c77-9821-99ee5fc4866e" (UID: "97fbd2d6-e31b-4c77-9821-99ee5fc4866e"). InnerVolumeSpecName "kube-api-access-pnvgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.210774 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "97fbd2d6-e31b-4c77-9821-99ee5fc4866e" (UID: "97fbd2d6-e31b-4c77-9821-99ee5fc4866e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.297203 4687 generic.go:334] "Generic (PLEG): container finished" podID="6fe6f8df-8219-45e4-a02b-36b7ad63cf61" containerID="f10425c6010b47642cec8340ebadb40027c41bdc6c69ca3992946c657c4fa949" exitCode=0 Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.297297 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.297327 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f" event={"ID":"6fe6f8df-8219-45e4-a02b-36b7ad63cf61","Type":"ContainerDied","Data":"f10425c6010b47642cec8340ebadb40027c41bdc6c69ca3992946c657c4fa949"} Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.297726 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f" event={"ID":"6fe6f8df-8219-45e4-a02b-36b7ad63cf61","Type":"ContainerDied","Data":"9c898bc47d1dde220e0e3d688771c9bf4d84efe9694fe1131a8ddf08d4db52f9"} Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.297765 4687 scope.go:117] "RemoveContainer" containerID="f10425c6010b47642cec8340ebadb40027c41bdc6c69ca3992946c657c4fa949" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.299504 4687 generic.go:334] "Generic (PLEG): container finished" podID="97fbd2d6-e31b-4c77-9821-99ee5fc4866e" containerID="d456a2161a78e68cb5ef5534411ff7252223f7d5f3e1d85fa4e61cca7062f96d" exitCode=0 Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.299588 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" event={"ID":"97fbd2d6-e31b-4c77-9821-99ee5fc4866e","Type":"ContainerDied","Data":"d456a2161a78e68cb5ef5534411ff7252223f7d5f3e1d85fa4e61cca7062f96d"} Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.299628 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.299633 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg" event={"ID":"97fbd2d6-e31b-4c77-9821-99ee5fc4866e","Type":"ContainerDied","Data":"41a902b5c1344142597b418a71e9f72d7b78adb670b773c7a98fd13e68427b3a"} Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.308742 4687 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.309119 4687 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.309142 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnvgv\" (UniqueName: \"kubernetes.io/projected/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-kube-api-access-pnvgv\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.309156 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.309168 4687 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97fbd2d6-e31b-4c77-9821-99ee5fc4866e-client-ca\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.316688 4687 scope.go:117] "RemoveContainer" containerID="f10425c6010b47642cec8340ebadb40027c41bdc6c69ca3992946c657c4fa949" Feb 28 09:06:11 crc kubenswrapper[4687]: E0228 09:06:11.317092 
4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f10425c6010b47642cec8340ebadb40027c41bdc6c69ca3992946c657c4fa949\": container with ID starting with f10425c6010b47642cec8340ebadb40027c41bdc6c69ca3992946c657c4fa949 not found: ID does not exist" containerID="f10425c6010b47642cec8340ebadb40027c41bdc6c69ca3992946c657c4fa949" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.317127 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f10425c6010b47642cec8340ebadb40027c41bdc6c69ca3992946c657c4fa949"} err="failed to get container status \"f10425c6010b47642cec8340ebadb40027c41bdc6c69ca3992946c657c4fa949\": rpc error: code = NotFound desc = could not find container \"f10425c6010b47642cec8340ebadb40027c41bdc6c69ca3992946c657c4fa949\": container with ID starting with f10425c6010b47642cec8340ebadb40027c41bdc6c69ca3992946c657c4fa949 not found: ID does not exist" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.317156 4687 scope.go:117] "RemoveContainer" containerID="d456a2161a78e68cb5ef5534411ff7252223f7d5f3e1d85fa4e61cca7062f96d" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.325927 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f"] Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.328800 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-676894c655-2fj8f"] Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.340174 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg"] Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.343217 4687 scope.go:117] "RemoveContainer" containerID="d456a2161a78e68cb5ef5534411ff7252223f7d5f3e1d85fa4e61cca7062f96d" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 
09:06:11.343262 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-ffc9d9c9c-tztvg"] Feb 28 09:06:11 crc kubenswrapper[4687]: E0228 09:06:11.343612 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d456a2161a78e68cb5ef5534411ff7252223f7d5f3e1d85fa4e61cca7062f96d\": container with ID starting with d456a2161a78e68cb5ef5534411ff7252223f7d5f3e1d85fa4e61cca7062f96d not found: ID does not exist" containerID="d456a2161a78e68cb5ef5534411ff7252223f7d5f3e1d85fa4e61cca7062f96d" Feb 28 09:06:11 crc kubenswrapper[4687]: I0228 09:06:11.343709 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d456a2161a78e68cb5ef5534411ff7252223f7d5f3e1d85fa4e61cca7062f96d"} err="failed to get container status \"d456a2161a78e68cb5ef5534411ff7252223f7d5f3e1d85fa4e61cca7062f96d\": rpc error: code = NotFound desc = could not find container \"d456a2161a78e68cb5ef5534411ff7252223f7d5f3e1d85fa4e61cca7062f96d\": container with ID starting with d456a2161a78e68cb5ef5534411ff7252223f7d5f3e1d85fa4e61cca7062f96d not found: ID does not exist" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.351227 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65c67c455c-4g57g"] Feb 28 09:06:12 crc kubenswrapper[4687]: E0228 09:06:12.351559 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecbc5046-5a52-46b8-8d92-45e22891bd1d" containerName="oc" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.351573 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecbc5046-5a52-46b8-8d92-45e22891bd1d" containerName="oc" Feb 28 09:06:12 crc kubenswrapper[4687]: E0228 09:06:12.351593 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97fbd2d6-e31b-4c77-9821-99ee5fc4866e" containerName="controller-manager" Feb 28 09:06:12 crc 
kubenswrapper[4687]: I0228 09:06:12.351599 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="97fbd2d6-e31b-4c77-9821-99ee5fc4866e" containerName="controller-manager" Feb 28 09:06:12 crc kubenswrapper[4687]: E0228 09:06:12.351613 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe6f8df-8219-45e4-a02b-36b7ad63cf61" containerName="route-controller-manager" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.351619 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe6f8df-8219-45e4-a02b-36b7ad63cf61" containerName="route-controller-manager" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.351748 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fe6f8df-8219-45e4-a02b-36b7ad63cf61" containerName="route-controller-manager" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.351761 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="97fbd2d6-e31b-4c77-9821-99ee5fc4866e" containerName="controller-manager" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.351775 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecbc5046-5a52-46b8-8d92-45e22891bd1d" containerName="oc" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.352374 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65c67c455c-4g57g" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.354970 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.356034 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.356069 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.356101 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.356038 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.356757 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-b7ccdfcb8-tcfz7"] Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.356871 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.357421 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b7ccdfcb8-tcfz7" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.359545 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.359807 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.360080 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.360252 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.360727 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.361322 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.362852 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65c67c455c-4g57g"] Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.363678 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.365070 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b7ccdfcb8-tcfz7"] Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.424401 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6097249e-c552-476f-97e5-e27814e2a15c-serving-cert\") pod \"controller-manager-b7ccdfcb8-tcfz7\" (UID: \"6097249e-c552-476f-97e5-e27814e2a15c\") " pod="openshift-controller-manager/controller-manager-b7ccdfcb8-tcfz7" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.424472 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd99cc76-0a6a-425a-8d60-6b838f537677-client-ca\") pod \"route-controller-manager-65c67c455c-4g57g\" (UID: \"dd99cc76-0a6a-425a-8d60-6b838f537677\") " pod="openshift-route-controller-manager/route-controller-manager-65c67c455c-4g57g" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.424540 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q8pl\" (UniqueName: \"kubernetes.io/projected/6097249e-c552-476f-97e5-e27814e2a15c-kube-api-access-7q8pl\") pod \"controller-manager-b7ccdfcb8-tcfz7\" (UID: \"6097249e-c552-476f-97e5-e27814e2a15c\") " pod="openshift-controller-manager/controller-manager-b7ccdfcb8-tcfz7" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.424608 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjxwd\" (UniqueName: \"kubernetes.io/projected/dd99cc76-0a6a-425a-8d60-6b838f537677-kube-api-access-hjxwd\") pod \"route-controller-manager-65c67c455c-4g57g\" (UID: \"dd99cc76-0a6a-425a-8d60-6b838f537677\") " pod="openshift-route-controller-manager/route-controller-manager-65c67c455c-4g57g" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.424634 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd99cc76-0a6a-425a-8d60-6b838f537677-serving-cert\") pod \"route-controller-manager-65c67c455c-4g57g\" (UID: \"dd99cc76-0a6a-425a-8d60-6b838f537677\") " 
pod="openshift-route-controller-manager/route-controller-manager-65c67c455c-4g57g" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.424760 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6097249e-c552-476f-97e5-e27814e2a15c-config\") pod \"controller-manager-b7ccdfcb8-tcfz7\" (UID: \"6097249e-c552-476f-97e5-e27814e2a15c\") " pod="openshift-controller-manager/controller-manager-b7ccdfcb8-tcfz7" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.424810 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6097249e-c552-476f-97e5-e27814e2a15c-client-ca\") pod \"controller-manager-b7ccdfcb8-tcfz7\" (UID: \"6097249e-c552-476f-97e5-e27814e2a15c\") " pod="openshift-controller-manager/controller-manager-b7ccdfcb8-tcfz7" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.424851 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6097249e-c552-476f-97e5-e27814e2a15c-proxy-ca-bundles\") pod \"controller-manager-b7ccdfcb8-tcfz7\" (UID: \"6097249e-c552-476f-97e5-e27814e2a15c\") " pod="openshift-controller-manager/controller-manager-b7ccdfcb8-tcfz7" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.424872 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd99cc76-0a6a-425a-8d60-6b838f537677-config\") pod \"route-controller-manager-65c67c455c-4g57g\" (UID: \"dd99cc76-0a6a-425a-8d60-6b838f537677\") " pod="openshift-route-controller-manager/route-controller-manager-65c67c455c-4g57g" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.525824 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6097249e-c552-476f-97e5-e27814e2a15c-config\") pod \"controller-manager-b7ccdfcb8-tcfz7\" (UID: \"6097249e-c552-476f-97e5-e27814e2a15c\") " pod="openshift-controller-manager/controller-manager-b7ccdfcb8-tcfz7" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.525876 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6097249e-c552-476f-97e5-e27814e2a15c-client-ca\") pod \"controller-manager-b7ccdfcb8-tcfz7\" (UID: \"6097249e-c552-476f-97e5-e27814e2a15c\") " pod="openshift-controller-manager/controller-manager-b7ccdfcb8-tcfz7" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.525904 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6097249e-c552-476f-97e5-e27814e2a15c-proxy-ca-bundles\") pod \"controller-manager-b7ccdfcb8-tcfz7\" (UID: \"6097249e-c552-476f-97e5-e27814e2a15c\") " pod="openshift-controller-manager/controller-manager-b7ccdfcb8-tcfz7" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.525923 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd99cc76-0a6a-425a-8d60-6b838f537677-config\") pod \"route-controller-manager-65c67c455c-4g57g\" (UID: \"dd99cc76-0a6a-425a-8d60-6b838f537677\") " pod="openshift-route-controller-manager/route-controller-manager-65c67c455c-4g57g" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.525972 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6097249e-c552-476f-97e5-e27814e2a15c-serving-cert\") pod \"controller-manager-b7ccdfcb8-tcfz7\" (UID: \"6097249e-c552-476f-97e5-e27814e2a15c\") " pod="openshift-controller-manager/controller-manager-b7ccdfcb8-tcfz7" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.526004 4687 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd99cc76-0a6a-425a-8d60-6b838f537677-client-ca\") pod \"route-controller-manager-65c67c455c-4g57g\" (UID: \"dd99cc76-0a6a-425a-8d60-6b838f537677\") " pod="openshift-route-controller-manager/route-controller-manager-65c67c455c-4g57g" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.526056 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q8pl\" (UniqueName: \"kubernetes.io/projected/6097249e-c552-476f-97e5-e27814e2a15c-kube-api-access-7q8pl\") pod \"controller-manager-b7ccdfcb8-tcfz7\" (UID: \"6097249e-c552-476f-97e5-e27814e2a15c\") " pod="openshift-controller-manager/controller-manager-b7ccdfcb8-tcfz7" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.526105 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjxwd\" (UniqueName: \"kubernetes.io/projected/dd99cc76-0a6a-425a-8d60-6b838f537677-kube-api-access-hjxwd\") pod \"route-controller-manager-65c67c455c-4g57g\" (UID: \"dd99cc76-0a6a-425a-8d60-6b838f537677\") " pod="openshift-route-controller-manager/route-controller-manager-65c67c455c-4g57g" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.526133 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd99cc76-0a6a-425a-8d60-6b838f537677-serving-cert\") pod \"route-controller-manager-65c67c455c-4g57g\" (UID: \"dd99cc76-0a6a-425a-8d60-6b838f537677\") " pod="openshift-route-controller-manager/route-controller-manager-65c67c455c-4g57g" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.527218 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd99cc76-0a6a-425a-8d60-6b838f537677-client-ca\") pod \"route-controller-manager-65c67c455c-4g57g\" (UID: 
\"dd99cc76-0a6a-425a-8d60-6b838f537677\") " pod="openshift-route-controller-manager/route-controller-manager-65c67c455c-4g57g" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.527452 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6097249e-c552-476f-97e5-e27814e2a15c-client-ca\") pod \"controller-manager-b7ccdfcb8-tcfz7\" (UID: \"6097249e-c552-476f-97e5-e27814e2a15c\") " pod="openshift-controller-manager/controller-manager-b7ccdfcb8-tcfz7" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.527596 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd99cc76-0a6a-425a-8d60-6b838f537677-config\") pod \"route-controller-manager-65c67c455c-4g57g\" (UID: \"dd99cc76-0a6a-425a-8d60-6b838f537677\") " pod="openshift-route-controller-manager/route-controller-manager-65c67c455c-4g57g" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.527688 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6097249e-c552-476f-97e5-e27814e2a15c-proxy-ca-bundles\") pod \"controller-manager-b7ccdfcb8-tcfz7\" (UID: \"6097249e-c552-476f-97e5-e27814e2a15c\") " pod="openshift-controller-manager/controller-manager-b7ccdfcb8-tcfz7" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.528008 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6097249e-c552-476f-97e5-e27814e2a15c-config\") pod \"controller-manager-b7ccdfcb8-tcfz7\" (UID: \"6097249e-c552-476f-97e5-e27814e2a15c\") " pod="openshift-controller-manager/controller-manager-b7ccdfcb8-tcfz7" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.530452 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd99cc76-0a6a-425a-8d60-6b838f537677-serving-cert\") 
pod \"route-controller-manager-65c67c455c-4g57g\" (UID: \"dd99cc76-0a6a-425a-8d60-6b838f537677\") " pod="openshift-route-controller-manager/route-controller-manager-65c67c455c-4g57g" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.532081 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6097249e-c552-476f-97e5-e27814e2a15c-serving-cert\") pod \"controller-manager-b7ccdfcb8-tcfz7\" (UID: \"6097249e-c552-476f-97e5-e27814e2a15c\") " pod="openshift-controller-manager/controller-manager-b7ccdfcb8-tcfz7" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.540090 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q8pl\" (UniqueName: \"kubernetes.io/projected/6097249e-c552-476f-97e5-e27814e2a15c-kube-api-access-7q8pl\") pod \"controller-manager-b7ccdfcb8-tcfz7\" (UID: \"6097249e-c552-476f-97e5-e27814e2a15c\") " pod="openshift-controller-manager/controller-manager-b7ccdfcb8-tcfz7" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.541110 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjxwd\" (UniqueName: \"kubernetes.io/projected/dd99cc76-0a6a-425a-8d60-6b838f537677-kube-api-access-hjxwd\") pod \"route-controller-manager-65c67c455c-4g57g\" (UID: \"dd99cc76-0a6a-425a-8d60-6b838f537677\") " pod="openshift-route-controller-manager/route-controller-manager-65c67c455c-4g57g" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.663466 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fe6f8df-8219-45e4-a02b-36b7ad63cf61" path="/var/lib/kubelet/pods/6fe6f8df-8219-45e4-a02b-36b7ad63cf61/volumes" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.664144 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97fbd2d6-e31b-4c77-9821-99ee5fc4866e" path="/var/lib/kubelet/pods/97fbd2d6-e31b-4c77-9821-99ee5fc4866e/volumes" Feb 28 09:06:12 crc 
kubenswrapper[4687]: I0228 09:06:12.666338 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65c67c455c-4g57g" Feb 28 09:06:12 crc kubenswrapper[4687]: I0228 09:06:12.671411 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b7ccdfcb8-tcfz7" Feb 28 09:06:13 crc kubenswrapper[4687]: I0228 09:06:13.036230 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b7ccdfcb8-tcfz7"] Feb 28 09:06:13 crc kubenswrapper[4687]: W0228 09:06:13.042259 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6097249e_c552_476f_97e5_e27814e2a15c.slice/crio-66ffd172339f4f5a886ce549e35422c2fda1d368e45f0102f1c1e4fa8409ca37 WatchSource:0}: Error finding container 66ffd172339f4f5a886ce549e35422c2fda1d368e45f0102f1c1e4fa8409ca37: Status 404 returned error can't find the container with id 66ffd172339f4f5a886ce549e35422c2fda1d368e45f0102f1c1e4fa8409ca37 Feb 28 09:06:13 crc kubenswrapper[4687]: I0228 09:06:13.069914 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65c67c455c-4g57g"] Feb 28 09:06:13 crc kubenswrapper[4687]: W0228 09:06:13.075894 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd99cc76_0a6a_425a_8d60_6b838f537677.slice/crio-7bc2e820076c60ea370bcf4d0bf913df6ba94bcc46ea77176e7f67f767e2433f WatchSource:0}: Error finding container 7bc2e820076c60ea370bcf4d0bf913df6ba94bcc46ea77176e7f67f767e2433f: Status 404 returned error can't find the container with id 7bc2e820076c60ea370bcf4d0bf913df6ba94bcc46ea77176e7f67f767e2433f Feb 28 09:06:13 crc kubenswrapper[4687]: I0228 09:06:13.316350 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-65c67c455c-4g57g" event={"ID":"dd99cc76-0a6a-425a-8d60-6b838f537677","Type":"ContainerStarted","Data":"5251c85d4ae2b4794fa40f0d97c5c6dda32c175afcb4ee03e4c57f121101bbfc"} Feb 28 09:06:13 crc kubenswrapper[4687]: I0228 09:06:13.316894 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-65c67c455c-4g57g" Feb 28 09:06:13 crc kubenswrapper[4687]: I0228 09:06:13.316916 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65c67c455c-4g57g" event={"ID":"dd99cc76-0a6a-425a-8d60-6b838f537677","Type":"ContainerStarted","Data":"7bc2e820076c60ea370bcf4d0bf913df6ba94bcc46ea77176e7f67f767e2433f"} Feb 28 09:06:13 crc kubenswrapper[4687]: I0228 09:06:13.318287 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b7ccdfcb8-tcfz7" event={"ID":"6097249e-c552-476f-97e5-e27814e2a15c","Type":"ContainerStarted","Data":"3d1a9451e6e714c460cce6171c4a49c913d1cb2914ce90e64b69553058afae24"} Feb 28 09:06:13 crc kubenswrapper[4687]: I0228 09:06:13.318340 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b7ccdfcb8-tcfz7" event={"ID":"6097249e-c552-476f-97e5-e27814e2a15c","Type":"ContainerStarted","Data":"66ffd172339f4f5a886ce549e35422c2fda1d368e45f0102f1c1e4fa8409ca37"} Feb 28 09:06:13 crc kubenswrapper[4687]: I0228 09:06:13.318601 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b7ccdfcb8-tcfz7" Feb 28 09:06:13 crc kubenswrapper[4687]: I0228 09:06:13.322391 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b7ccdfcb8-tcfz7" Feb 28 09:06:13 crc kubenswrapper[4687]: I0228 09:06:13.338731 4687 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-65c67c455c-4g57g" podStartSLOduration=3.338716816 podStartE2EDuration="3.338716816s" podCreationTimestamp="2026-02-28 09:06:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:06:13.337928172 +0000 UTC m=+165.028497510" watchObservedRunningTime="2026-02-28 09:06:13.338716816 +0000 UTC m=+165.029286153" Feb 28 09:06:13 crc kubenswrapper[4687]: I0228 09:06:13.360912 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-b7ccdfcb8-tcfz7" podStartSLOduration=3.360892426 podStartE2EDuration="3.360892426s" podCreationTimestamp="2026-02-28 09:06:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:06:13.357318965 +0000 UTC m=+165.047888302" watchObservedRunningTime="2026-02-28 09:06:13.360892426 +0000 UTC m=+165.051461763" Feb 28 09:06:13 crc kubenswrapper[4687]: I0228 09:06:13.698480 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-65c67c455c-4g57g" Feb 28 09:06:15 crc kubenswrapper[4687]: I0228 09:06:15.535185 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zhdhr"] Feb 28 09:06:40 crc kubenswrapper[4687]: I0228 09:06:40.557058 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" podUID="09d34ddd-da09-46e8-a9d5-5f395dbe8625" containerName="oauth-openshift" containerID="cri-o://3117d5bf7cf7e3d05a65694144dbdd3fb7db368e1b4c3616f23ac8346ea7c938" gracePeriod=15 Feb 28 09:06:40 crc kubenswrapper[4687]: I0228 09:06:40.989380 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.015240 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-66cd4f668b-hgpqw"] Feb 28 09:06:41 crc kubenswrapper[4687]: E0228 09:06:41.015453 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09d34ddd-da09-46e8-a9d5-5f395dbe8625" containerName="oauth-openshift" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.015472 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="09d34ddd-da09-46e8-a9d5-5f395dbe8625" containerName="oauth-openshift" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.015577 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="09d34ddd-da09-46e8-a9d5-5f395dbe8625" containerName="oauth-openshift" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.016009 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.027783 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66cd4f668b-hgpqw"] Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.183127 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/09d34ddd-da09-46e8-a9d5-5f395dbe8625-audit-dir\") pod \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.183194 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-service-ca\") pod \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " Feb 28 09:06:41 
crc kubenswrapper[4687]: I0228 09:06:41.183250 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-router-certs\") pod \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.183264 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/09d34ddd-da09-46e8-a9d5-5f395dbe8625-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "09d34ddd-da09-46e8-a9d5-5f395dbe8625" (UID: "09d34ddd-da09-46e8-a9d5-5f395dbe8625"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.183296 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-user-template-provider-selection\") pod \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.183464 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-session\") pod \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.183560 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-cliconfig\") pod \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " Feb 28 09:06:41 crc 
kubenswrapper[4687]: I0228 09:06:41.183600 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-ocp-branding-template\") pod \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.183661 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-user-template-login\") pod \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.183694 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-trusted-ca-bundle\") pod \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.183757 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q28j9\" (UniqueName: \"kubernetes.io/projected/09d34ddd-da09-46e8-a9d5-5f395dbe8625-kube-api-access-q28j9\") pod \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.184495 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "09d34ddd-da09-46e8-a9d5-5f395dbe8625" (UID: "09d34ddd-da09-46e8-a9d5-5f395dbe8625"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.184764 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "09d34ddd-da09-46e8-a9d5-5f395dbe8625" (UID: "09d34ddd-da09-46e8-a9d5-5f395dbe8625"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.184839 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "09d34ddd-da09-46e8-a9d5-5f395dbe8625" (UID: "09d34ddd-da09-46e8-a9d5-5f395dbe8625"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.184865 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-serving-cert\") pod \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.184922 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09d34ddd-da09-46e8-a9d5-5f395dbe8625-audit-policies\") pod \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.184944 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-user-template-error\") pod \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.185371 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09d34ddd-da09-46e8-a9d5-5f395dbe8625-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09d34ddd-da09-46e8-a9d5-5f395dbe8625" (UID: "09d34ddd-da09-46e8-a9d5-5f395dbe8625"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.185485 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-user-idp-0-file-data\") pod \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\" (UID: \"09d34ddd-da09-46e8-a9d5-5f395dbe8625\") " Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.186241 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-system-session\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.186279 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-user-template-login\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.186304 
4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxr5n\" (UniqueName: \"kubernetes.io/projected/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-kube-api-access-jxr5n\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.186366 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.186585 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-user-template-error\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.186618 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-audit-policies\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.186647 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.186794 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-system-service-ca\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.186905 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.186996 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-system-router-certs\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.187136 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-audit-dir\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: 
\"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.187197 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.187283 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.187333 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.187538 4687 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/09d34ddd-da09-46e8-a9d5-5f395dbe8625-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.187573 4687 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.187590 4687 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.187603 4687 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.187613 4687 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09d34ddd-da09-46e8-a9d5-5f395dbe8625-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.190197 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09d34ddd-da09-46e8-a9d5-5f395dbe8625-kube-api-access-q28j9" (OuterVolumeSpecName: "kube-api-access-q28j9") pod "09d34ddd-da09-46e8-a9d5-5f395dbe8625" (UID: "09d34ddd-da09-46e8-a9d5-5f395dbe8625"). InnerVolumeSpecName "kube-api-access-q28j9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.190229 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "09d34ddd-da09-46e8-a9d5-5f395dbe8625" (UID: "09d34ddd-da09-46e8-a9d5-5f395dbe8625"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.190524 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "09d34ddd-da09-46e8-a9d5-5f395dbe8625" (UID: "09d34ddd-da09-46e8-a9d5-5f395dbe8625"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.190631 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "09d34ddd-da09-46e8-a9d5-5f395dbe8625" (UID: "09d34ddd-da09-46e8-a9d5-5f395dbe8625"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.190885 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "09d34ddd-da09-46e8-a9d5-5f395dbe8625" (UID: "09d34ddd-da09-46e8-a9d5-5f395dbe8625"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.191102 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "09d34ddd-da09-46e8-a9d5-5f395dbe8625" (UID: "09d34ddd-da09-46e8-a9d5-5f395dbe8625"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.198171 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "09d34ddd-da09-46e8-a9d5-5f395dbe8625" (UID: "09d34ddd-da09-46e8-a9d5-5f395dbe8625"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.198481 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "09d34ddd-da09-46e8-a9d5-5f395dbe8625" (UID: "09d34ddd-da09-46e8-a9d5-5f395dbe8625"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.198637 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "09d34ddd-da09-46e8-a9d5-5f395dbe8625" (UID: "09d34ddd-da09-46e8-a9d5-5f395dbe8625"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.288827 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.288894 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.288918 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.288950 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-system-session\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.288968 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-user-template-login\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.288983 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxr5n\" (UniqueName: \"kubernetes.io/projected/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-kube-api-access-jxr5n\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.289007 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.289057 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-user-template-error\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.289078 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-audit-policies\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " 
pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.289095 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.289112 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-system-service-ca\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.289131 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.289161 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-system-router-certs\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.289184 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-audit-dir\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.289220 4687 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.289229 4687 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.289239 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q28j9\" (UniqueName: \"kubernetes.io/projected/09d34ddd-da09-46e8-a9d5-5f395dbe8625-kube-api-access-q28j9\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.289248 4687 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.289259 4687 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.289269 4687 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.289279 4687 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.289288 4687 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.289297 4687 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/09d34ddd-da09-46e8-a9d5-5f395dbe8625-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.289328 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-audit-dir\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.289940 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.290160 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-audit-policies\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.290414 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-system-service-ca\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.290640 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.292432 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.292504 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-user-template-login\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " 
pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.292576 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.292757 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-system-session\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.292779 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.292915 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-system-router-certs\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.293364 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.293654 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-v4-0-config-user-template-error\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.302431 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxr5n\" (UniqueName: \"kubernetes.io/projected/2abef9f3-5701-47e8-b065-dcc3de7e0ce5-kube-api-access-jxr5n\") pod \"oauth-openshift-66cd4f668b-hgpqw\" (UID: \"2abef9f3-5701-47e8-b065-dcc3de7e0ce5\") " pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.332942 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.479313 4687 generic.go:334] "Generic (PLEG): container finished" podID="09d34ddd-da09-46e8-a9d5-5f395dbe8625" containerID="3117d5bf7cf7e3d05a65694144dbdd3fb7db368e1b4c3616f23ac8346ea7c938" exitCode=0 Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.479378 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" event={"ID":"09d34ddd-da09-46e8-a9d5-5f395dbe8625","Type":"ContainerDied","Data":"3117d5bf7cf7e3d05a65694144dbdd3fb7db368e1b4c3616f23ac8346ea7c938"} Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.479409 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" event={"ID":"09d34ddd-da09-46e8-a9d5-5f395dbe8625","Type":"ContainerDied","Data":"e295880c9e670d2e3a379dabb786bc971d87858f4b4c0b7b624f5ef110248d08"} Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.479444 4687 scope.go:117] "RemoveContainer" containerID="3117d5bf7cf7e3d05a65694144dbdd3fb7db368e1b4c3616f23ac8346ea7c938" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.479569 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zhdhr" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.500300 4687 scope.go:117] "RemoveContainer" containerID="3117d5bf7cf7e3d05a65694144dbdd3fb7db368e1b4c3616f23ac8346ea7c938" Feb 28 09:06:41 crc kubenswrapper[4687]: E0228 09:06:41.514903 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3117d5bf7cf7e3d05a65694144dbdd3fb7db368e1b4c3616f23ac8346ea7c938\": container with ID starting with 3117d5bf7cf7e3d05a65694144dbdd3fb7db368e1b4c3616f23ac8346ea7c938 not found: ID does not exist" containerID="3117d5bf7cf7e3d05a65694144dbdd3fb7db368e1b4c3616f23ac8346ea7c938" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.514987 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3117d5bf7cf7e3d05a65694144dbdd3fb7db368e1b4c3616f23ac8346ea7c938"} err="failed to get container status \"3117d5bf7cf7e3d05a65694144dbdd3fb7db368e1b4c3616f23ac8346ea7c938\": rpc error: code = NotFound desc = could not find container \"3117d5bf7cf7e3d05a65694144dbdd3fb7db368e1b4c3616f23ac8346ea7c938\": container with ID starting with 3117d5bf7cf7e3d05a65694144dbdd3fb7db368e1b4c3616f23ac8346ea7c938 not found: ID does not exist" Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.521176 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zhdhr"] Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.523261 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zhdhr"] Feb 28 09:06:41 crc kubenswrapper[4687]: I0228 09:06:41.696670 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66cd4f668b-hgpqw"] Feb 28 09:06:42 crc kubenswrapper[4687]: I0228 09:06:42.490841 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" event={"ID":"2abef9f3-5701-47e8-b065-dcc3de7e0ce5","Type":"ContainerStarted","Data":"23d986c2f119f705133735025e858d5b1142958a9f410b252ed1617dacdcf3c1"} Feb 28 09:06:42 crc kubenswrapper[4687]: I0228 09:06:42.490908 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" event={"ID":"2abef9f3-5701-47e8-b065-dcc3de7e0ce5","Type":"ContainerStarted","Data":"12be65f7e1d15163619f989c19158b7543157d5d7b0b0d3f300a4a48c4c442af"} Feb 28 09:06:42 crc kubenswrapper[4687]: I0228 09:06:42.491268 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:42 crc kubenswrapper[4687]: I0228 09:06:42.496078 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" Feb 28 09:06:42 crc kubenswrapper[4687]: I0228 09:06:42.509079 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-66cd4f668b-hgpqw" podStartSLOduration=27.509045388 podStartE2EDuration="27.509045388s" podCreationTimestamp="2026-02-28 09:06:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:06:42.508362353 +0000 UTC m=+194.198931690" watchObservedRunningTime="2026-02-28 09:06:42.509045388 +0000 UTC m=+194.199614725" Feb 28 09:06:42 crc kubenswrapper[4687]: I0228 09:06:42.661862 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09d34ddd-da09-46e8-a9d5-5f395dbe8625" path="/var/lib/kubelet/pods/09d34ddd-da09-46e8-a9d5-5f395dbe8625/volumes" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.081614 4687 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 28 09:06:45 crc 
kubenswrapper[4687]: I0228 09:06:45.082403 4687 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.082576 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.083596 4687 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 28 09:06:45 crc kubenswrapper[4687]: E0228 09:06:45.083971 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.083997 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 28 09:06:45 crc kubenswrapper[4687]: E0228 09:06:45.084010 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.084034 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 09:06:45 crc kubenswrapper[4687]: E0228 09:06:45.084047 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.084057 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 28 09:06:45 crc kubenswrapper[4687]: E0228 09:06:45.084070 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-insecure-readyz" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.084077 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 28 09:06:45 crc kubenswrapper[4687]: E0228 09:06:45.084089 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.084096 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 09:06:45 crc kubenswrapper[4687]: E0228 09:06:45.084108 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.084114 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 09:06:45 crc kubenswrapper[4687]: E0228 09:06:45.084131 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.084139 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 28 09:06:45 crc kubenswrapper[4687]: E0228 09:06:45.084157 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.084163 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.084298 4687 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.084314 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.084322 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.084331 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.084340 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.084350 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.084357 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.084366 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.084374 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 28 09:06:45 crc kubenswrapper[4687]: E0228 09:06:45.084508 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.084517 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 09:06:45 crc kubenswrapper[4687]: E0228 09:06:45.084528 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.084534 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.116480 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.136123 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.136257 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.136315 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.136349 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.136381 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.136432 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.136532 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.136609 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.238600 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.238689 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.238722 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.238740 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.238774 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.238752 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.238808 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.238824 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.238870 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.238903 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 
09:06:45.238983 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.238998 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.238993 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.239096 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.239115 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.239149 4687 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.412486 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.515714 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2cc433b4847702208af36decafd3da711eb1441505e6858da66a173f065ddfe5"} Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.517769 4687 generic.go:334] "Generic (PLEG): container finished" podID="de6e0421-8332-4c89-bdcb-4406af730891" containerID="d89880e3617e39c85fd05ec0a5463d54d24e72306d592ca43563c703c98c6587" exitCode=0 Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.517866 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"de6e0421-8332-4c89-bdcb-4406af730891","Type":"ContainerDied","Data":"d89880e3617e39c85fd05ec0a5463d54d24e72306d592ca43563c703c98c6587"} Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.518083 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://8c587833f88ced400be887fea9b2db8c115b30eded6c6bf9db0d5f1fc87c6c37" gracePeriod=15 Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.518125 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://822b6e44a39f66516e6e85496812470f40eecfb6bb6081cc4b35583fbb308dd9" gracePeriod=15 Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.518178 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://b07f07f4633ccdd6561d83bf0f20f0800cf1b938d203767f0d27c5acb3fd5aa1" gracePeriod=15 Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.518197 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://10ac076b0cc2345aec565565d566ee0f441ff79c4dc84fc36481e148f5ada685" gracePeriod=15 Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.518231 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://b7c0ec64d57de434662d5c2bf49e0665706fa98f05029eec8887a252174a9568" gracePeriod=15 Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.532735 4687 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Feb 28 09:06:45 crc kubenswrapper[4687]: E0228 09:06:45.657352 4687 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.25.194:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18985dd1035da6c3 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Created,Message:Created container startup-monitor,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:06:45.656307395 +0000 UTC m=+197.346876731,LastTimestamp:2026-02-28 09:06:45.656307395 +0000 UTC m=+197.346876731,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:06:45 crc kubenswrapper[4687]: E0228 09:06:45.744498 4687 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.194:6443: connect: connection refused" Feb 28 09:06:45 crc kubenswrapper[4687]: E0228 09:06:45.745113 4687 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.194:6443: connect: connection refused" Feb 28 09:06:45 crc kubenswrapper[4687]: E0228 09:06:45.745978 4687 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.194:6443: connect: connection refused" Feb 28 09:06:45 crc kubenswrapper[4687]: E0228 09:06:45.746627 4687 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.194:6443: connect: connection refused" Feb 28 09:06:45 crc kubenswrapper[4687]: E0228 09:06:45.747124 4687 controller.go:195] "Failed to update lease" 
err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.194:6443: connect: connection refused" Feb 28 09:06:45 crc kubenswrapper[4687]: I0228 09:06:45.747161 4687 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 28 09:06:45 crc kubenswrapper[4687]: E0228 09:06:45.747483 4687 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.194:6443: connect: connection refused" interval="200ms" Feb 28 09:06:45 crc kubenswrapper[4687]: E0228 09:06:45.949051 4687 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.194:6443: connect: connection refused" interval="400ms" Feb 28 09:06:46 crc kubenswrapper[4687]: E0228 09:06:46.043681 4687 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:06:46Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:06:46Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:06:46Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T09:06:46Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 192.168.25.194:6443: connect: connection refused" Feb 28 09:06:46 crc kubenswrapper[4687]: E0228 09:06:46.044110 4687 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 192.168.25.194:6443: connect: connection refused" Feb 28 09:06:46 crc kubenswrapper[4687]: E0228 09:06:46.044439 4687 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 192.168.25.194:6443: connect: connection refused" Feb 28 09:06:46 crc kubenswrapper[4687]: E0228 09:06:46.044791 4687 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 192.168.25.194:6443: connect: connection refused" Feb 
28 09:06:46 crc kubenswrapper[4687]: E0228 09:06:46.045143 4687 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 192.168.25.194:6443: connect: connection refused" Feb 28 09:06:46 crc kubenswrapper[4687]: E0228 09:06:46.045173 4687 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 28 09:06:46 crc kubenswrapper[4687]: E0228 09:06:46.349781 4687 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.194:6443: connect: connection refused" interval="800ms" Feb 28 09:06:46 crc kubenswrapper[4687]: I0228 09:06:46.525547 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"904bce251d642b9e7421cc50ace35d8fe7dd220c90ec2de36ee8a5f20fe6f7e0"} Feb 28 09:06:46 crc kubenswrapper[4687]: I0228 09:06:46.526345 4687 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.25.194:6443: connect: connection refused" Feb 28 09:06:46 crc kubenswrapper[4687]: I0228 09:06:46.528107 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 28 09:06:46 crc kubenswrapper[4687]: I0228 09:06:46.529805 4687 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 28 09:06:46 crc kubenswrapper[4687]: I0228 09:06:46.530554 4687 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="822b6e44a39f66516e6e85496812470f40eecfb6bb6081cc4b35583fbb308dd9" exitCode=0 Feb 28 09:06:46 crc kubenswrapper[4687]: I0228 09:06:46.530581 4687 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b7c0ec64d57de434662d5c2bf49e0665706fa98f05029eec8887a252174a9568" exitCode=0 Feb 28 09:06:46 crc kubenswrapper[4687]: I0228 09:06:46.530589 4687 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="10ac076b0cc2345aec565565d566ee0f441ff79c4dc84fc36481e148f5ada685" exitCode=0 Feb 28 09:06:46 crc kubenswrapper[4687]: I0228 09:06:46.530598 4687 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b07f07f4633ccdd6561d83bf0f20f0800cf1b938d203767f0d27c5acb3fd5aa1" exitCode=2 Feb 28 09:06:46 crc kubenswrapper[4687]: I0228 09:06:46.530673 4687 scope.go:117] "RemoveContainer" containerID="110dc193591d77cad10858a579d47ef5c71456399bf60b68f6b36dc40fc19406" Feb 28 09:06:46 crc kubenswrapper[4687]: E0228 09:06:46.716877 4687 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.25.194:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18985dd1035da6c3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Created,Message:Created container startup-monitor,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:06:45.656307395 +0000 UTC m=+197.346876731,LastTimestamp:2026-02-28 09:06:45.656307395 +0000 UTC m=+197.346876731,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 09:06:46 crc kubenswrapper[4687]: I0228 09:06:46.807482 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 28 09:06:46 crc kubenswrapper[4687]: I0228 09:06:46.808049 4687 status_manager.go:851] "Failed to get status for pod" podUID="de6e0421-8332-4c89-bdcb-4406af730891" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.194:6443: connect: connection refused" Feb 28 09:06:46 crc kubenswrapper[4687]: I0228 09:06:46.808465 4687 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.25.194:6443: connect: connection refused" Feb 28 09:06:46 crc kubenswrapper[4687]: I0228 09:06:46.958871 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de6e0421-8332-4c89-bdcb-4406af730891-kubelet-dir\") pod \"de6e0421-8332-4c89-bdcb-4406af730891\" (UID: \"de6e0421-8332-4c89-bdcb-4406af730891\") " Feb 28 09:06:46 crc 
kubenswrapper[4687]: I0228 09:06:46.959003 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de6e0421-8332-4c89-bdcb-4406af730891-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "de6e0421-8332-4c89-bdcb-4406af730891" (UID: "de6e0421-8332-4c89-bdcb-4406af730891"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:06:46 crc kubenswrapper[4687]: I0228 09:06:46.959068 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/de6e0421-8332-4c89-bdcb-4406af730891-var-lock\") pod \"de6e0421-8332-4c89-bdcb-4406af730891\" (UID: \"de6e0421-8332-4c89-bdcb-4406af730891\") " Feb 28 09:06:46 crc kubenswrapper[4687]: I0228 09:06:46.959122 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de6e0421-8332-4c89-bdcb-4406af730891-kube-api-access\") pod \"de6e0421-8332-4c89-bdcb-4406af730891\" (UID: \"de6e0421-8332-4c89-bdcb-4406af730891\") " Feb 28 09:06:46 crc kubenswrapper[4687]: I0228 09:06:46.959139 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de6e0421-8332-4c89-bdcb-4406af730891-var-lock" (OuterVolumeSpecName: "var-lock") pod "de6e0421-8332-4c89-bdcb-4406af730891" (UID: "de6e0421-8332-4c89-bdcb-4406af730891"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:06:46 crc kubenswrapper[4687]: I0228 09:06:46.959585 4687 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de6e0421-8332-4c89-bdcb-4406af730891-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:46 crc kubenswrapper[4687]: I0228 09:06:46.959609 4687 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/de6e0421-8332-4c89-bdcb-4406af730891-var-lock\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:46 crc kubenswrapper[4687]: I0228 09:06:46.965887 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de6e0421-8332-4c89-bdcb-4406af730891-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "de6e0421-8332-4c89-bdcb-4406af730891" (UID: "de6e0421-8332-4c89-bdcb-4406af730891"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:06:47 crc kubenswrapper[4687]: I0228 09:06:47.060761 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de6e0421-8332-4c89-bdcb-4406af730891-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:47 crc kubenswrapper[4687]: E0228 09:06:47.150999 4687 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.194:6443: connect: connection refused" interval="1.6s" Feb 28 09:06:47 crc kubenswrapper[4687]: I0228 09:06:47.541155 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 28 09:06:47 crc kubenswrapper[4687]: I0228 09:06:47.543259 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"de6e0421-8332-4c89-bdcb-4406af730891","Type":"ContainerDied","Data":"81ca7f999d26f9e5d0fcd145958b72b30af1114b6d4c85ba4d911c9053c59754"} Feb 28 09:06:47 crc kubenswrapper[4687]: I0228 09:06:47.543312 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 28 09:06:47 crc kubenswrapper[4687]: I0228 09:06:47.543336 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81ca7f999d26f9e5d0fcd145958b72b30af1114b6d4c85ba4d911c9053c59754" Feb 28 09:06:47 crc kubenswrapper[4687]: I0228 09:06:47.555657 4687 status_manager.go:851] "Failed to get status for pod" podUID="de6e0421-8332-4c89-bdcb-4406af730891" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.194:6443: connect: connection refused" Feb 28 09:06:47 crc kubenswrapper[4687]: I0228 09:06:47.555903 4687 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.25.194:6443: connect: connection refused" Feb 28 09:06:47 crc kubenswrapper[4687]: I0228 09:06:47.815739 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 28 09:06:47 crc kubenswrapper[4687]: I0228 09:06:47.816943 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:06:47 crc kubenswrapper[4687]: I0228 09:06:47.817681 4687 status_manager.go:851] "Failed to get status for pod" podUID="de6e0421-8332-4c89-bdcb-4406af730891" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.194:6443: connect: connection refused" Feb 28 09:06:47 crc kubenswrapper[4687]: I0228 09:06:47.818254 4687 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.194:6443: connect: connection refused" Feb 28 09:06:47 crc kubenswrapper[4687]: I0228 09:06:47.818741 4687 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.25.194:6443: connect: connection refused" Feb 28 09:06:47 crc kubenswrapper[4687]: I0228 09:06:47.869533 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 28 09:06:47 crc kubenswrapper[4687]: I0228 09:06:47.869582 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 28 09:06:47 crc kubenswrapper[4687]: I0228 09:06:47.869605 
4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 28 09:06:47 crc kubenswrapper[4687]: I0228 09:06:47.869642 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:06:47 crc kubenswrapper[4687]: I0228 09:06:47.869687 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:06:47 crc kubenswrapper[4687]: I0228 09:06:47.869736 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:06:47 crc kubenswrapper[4687]: I0228 09:06:47.870143 4687 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:47 crc kubenswrapper[4687]: I0228 09:06:47.870171 4687 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:47 crc kubenswrapper[4687]: I0228 09:06:47.870185 4687 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.550838 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.551473 4687 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8c587833f88ced400be887fea9b2db8c115b30eded6c6bf9db0d5f1fc87c6c37" exitCode=0 Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.551546 4687 scope.go:117] "RemoveContainer" containerID="822b6e44a39f66516e6e85496812470f40eecfb6bb6081cc4b35583fbb308dd9" Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.551596 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.562470 4687 status_manager.go:851] "Failed to get status for pod" podUID="de6e0421-8332-4c89-bdcb-4406af730891" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.194:6443: connect: connection refused" Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.562772 4687 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.194:6443: connect: connection refused" Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.563057 4687 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.25.194:6443: connect: connection refused" Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.565558 4687 scope.go:117] "RemoveContainer" containerID="b7c0ec64d57de434662d5c2bf49e0665706fa98f05029eec8887a252174a9568" Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.575705 4687 scope.go:117] "RemoveContainer" containerID="10ac076b0cc2345aec565565d566ee0f441ff79c4dc84fc36481e148f5ada685" Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.588857 4687 scope.go:117] "RemoveContainer" containerID="b07f07f4633ccdd6561d83bf0f20f0800cf1b938d203767f0d27c5acb3fd5aa1" Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.600655 4687 scope.go:117] "RemoveContainer" containerID="8c587833f88ced400be887fea9b2db8c115b30eded6c6bf9db0d5f1fc87c6c37" Feb 28 09:06:48 
crc kubenswrapper[4687]: I0228 09:06:48.623678 4687 scope.go:117] "RemoveContainer" containerID="0aa898df0d68b69e98c254ee6873db17db552db751ed7c2905aa5036dc86badb"
Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.640324 4687 scope.go:117] "RemoveContainer" containerID="822b6e44a39f66516e6e85496812470f40eecfb6bb6081cc4b35583fbb308dd9"
Feb 28 09:06:48 crc kubenswrapper[4687]: E0228 09:06:48.641396 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"822b6e44a39f66516e6e85496812470f40eecfb6bb6081cc4b35583fbb308dd9\": container with ID starting with 822b6e44a39f66516e6e85496812470f40eecfb6bb6081cc4b35583fbb308dd9 not found: ID does not exist" containerID="822b6e44a39f66516e6e85496812470f40eecfb6bb6081cc4b35583fbb308dd9"
Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.641455 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"822b6e44a39f66516e6e85496812470f40eecfb6bb6081cc4b35583fbb308dd9"} err="failed to get container status \"822b6e44a39f66516e6e85496812470f40eecfb6bb6081cc4b35583fbb308dd9\": rpc error: code = NotFound desc = could not find container \"822b6e44a39f66516e6e85496812470f40eecfb6bb6081cc4b35583fbb308dd9\": container with ID starting with 822b6e44a39f66516e6e85496812470f40eecfb6bb6081cc4b35583fbb308dd9 not found: ID does not exist"
Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.641485 4687 scope.go:117] "RemoveContainer" containerID="b7c0ec64d57de434662d5c2bf49e0665706fa98f05029eec8887a252174a9568"
Feb 28 09:06:48 crc kubenswrapper[4687]: E0228 09:06:48.642238 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7c0ec64d57de434662d5c2bf49e0665706fa98f05029eec8887a252174a9568\": container with ID starting with b7c0ec64d57de434662d5c2bf49e0665706fa98f05029eec8887a252174a9568 not found: ID does not exist" containerID="b7c0ec64d57de434662d5c2bf49e0665706fa98f05029eec8887a252174a9568"
Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.642282 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7c0ec64d57de434662d5c2bf49e0665706fa98f05029eec8887a252174a9568"} err="failed to get container status \"b7c0ec64d57de434662d5c2bf49e0665706fa98f05029eec8887a252174a9568\": rpc error: code = NotFound desc = could not find container \"b7c0ec64d57de434662d5c2bf49e0665706fa98f05029eec8887a252174a9568\": container with ID starting with b7c0ec64d57de434662d5c2bf49e0665706fa98f05029eec8887a252174a9568 not found: ID does not exist"
Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.642300 4687 scope.go:117] "RemoveContainer" containerID="10ac076b0cc2345aec565565d566ee0f441ff79c4dc84fc36481e148f5ada685"
Feb 28 09:06:48 crc kubenswrapper[4687]: E0228 09:06:48.642609 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10ac076b0cc2345aec565565d566ee0f441ff79c4dc84fc36481e148f5ada685\": container with ID starting with 10ac076b0cc2345aec565565d566ee0f441ff79c4dc84fc36481e148f5ada685 not found: ID does not exist" containerID="10ac076b0cc2345aec565565d566ee0f441ff79c4dc84fc36481e148f5ada685"
Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.642646 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10ac076b0cc2345aec565565d566ee0f441ff79c4dc84fc36481e148f5ada685"} err="failed to get container status \"10ac076b0cc2345aec565565d566ee0f441ff79c4dc84fc36481e148f5ada685\": rpc error: code = NotFound desc = could not find container \"10ac076b0cc2345aec565565d566ee0f441ff79c4dc84fc36481e148f5ada685\": container with ID starting with 10ac076b0cc2345aec565565d566ee0f441ff79c4dc84fc36481e148f5ada685 not found: ID does not exist"
Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.642676 4687 scope.go:117] "RemoveContainer" containerID="b07f07f4633ccdd6561d83bf0f20f0800cf1b938d203767f0d27c5acb3fd5aa1"
Feb 28 09:06:48 crc kubenswrapper[4687]: E0228 09:06:48.643089 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b07f07f4633ccdd6561d83bf0f20f0800cf1b938d203767f0d27c5acb3fd5aa1\": container with ID starting with b07f07f4633ccdd6561d83bf0f20f0800cf1b938d203767f0d27c5acb3fd5aa1 not found: ID does not exist" containerID="b07f07f4633ccdd6561d83bf0f20f0800cf1b938d203767f0d27c5acb3fd5aa1"
Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.643145 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b07f07f4633ccdd6561d83bf0f20f0800cf1b938d203767f0d27c5acb3fd5aa1"} err="failed to get container status \"b07f07f4633ccdd6561d83bf0f20f0800cf1b938d203767f0d27c5acb3fd5aa1\": rpc error: code = NotFound desc = could not find container \"b07f07f4633ccdd6561d83bf0f20f0800cf1b938d203767f0d27c5acb3fd5aa1\": container with ID starting with b07f07f4633ccdd6561d83bf0f20f0800cf1b938d203767f0d27c5acb3fd5aa1 not found: ID does not exist"
Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.643163 4687 scope.go:117] "RemoveContainer" containerID="8c587833f88ced400be887fea9b2db8c115b30eded6c6bf9db0d5f1fc87c6c37"
Feb 28 09:06:48 crc kubenswrapper[4687]: E0228 09:06:48.643816 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c587833f88ced400be887fea9b2db8c115b30eded6c6bf9db0d5f1fc87c6c37\": container with ID starting with 8c587833f88ced400be887fea9b2db8c115b30eded6c6bf9db0d5f1fc87c6c37 not found: ID does not exist" containerID="8c587833f88ced400be887fea9b2db8c115b30eded6c6bf9db0d5f1fc87c6c37"
Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.643842 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c587833f88ced400be887fea9b2db8c115b30eded6c6bf9db0d5f1fc87c6c37"} err="failed to get container status \"8c587833f88ced400be887fea9b2db8c115b30eded6c6bf9db0d5f1fc87c6c37\": rpc error: code = NotFound desc = could not find container \"8c587833f88ced400be887fea9b2db8c115b30eded6c6bf9db0d5f1fc87c6c37\": container with ID starting with 8c587833f88ced400be887fea9b2db8c115b30eded6c6bf9db0d5f1fc87c6c37 not found: ID does not exist"
Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.643867 4687 scope.go:117] "RemoveContainer" containerID="0aa898df0d68b69e98c254ee6873db17db552db751ed7c2905aa5036dc86badb"
Feb 28 09:06:48 crc kubenswrapper[4687]: E0228 09:06:48.644134 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aa898df0d68b69e98c254ee6873db17db552db751ed7c2905aa5036dc86badb\": container with ID starting with 0aa898df0d68b69e98c254ee6873db17db552db751ed7c2905aa5036dc86badb not found: ID does not exist" containerID="0aa898df0d68b69e98c254ee6873db17db552db751ed7c2905aa5036dc86badb"
Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.644162 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aa898df0d68b69e98c254ee6873db17db552db751ed7c2905aa5036dc86badb"} err="failed to get container status \"0aa898df0d68b69e98c254ee6873db17db552db751ed7c2905aa5036dc86badb\": rpc error: code = NotFound desc = could not find container \"0aa898df0d68b69e98c254ee6873db17db552db751ed7c2905aa5036dc86badb\": container with ID starting with 0aa898df0d68b69e98c254ee6873db17db552db751ed7c2905aa5036dc86badb not found: ID does not exist"
Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.658440 4687 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.194:6443: connect: connection refused"
Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.658744 4687 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.25.194:6443: connect: connection refused"
Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.659053 4687 status_manager.go:851] "Failed to get status for pod" podUID="de6e0421-8332-4c89-bdcb-4406af730891" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.194:6443: connect: connection refused"
Feb 28 09:06:48 crc kubenswrapper[4687]: I0228 09:06:48.662801 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Feb 28 09:06:48 crc kubenswrapper[4687]: E0228 09:06:48.751711 4687 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.194:6443: connect: connection refused" interval="3.2s"
Feb 28 09:06:51 crc kubenswrapper[4687]: E0228 09:06:51.952847 4687 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.194:6443: connect: connection refused" interval="6.4s"
Feb 28 09:06:55 crc kubenswrapper[4687]: I0228 09:06:55.002271 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 28 09:06:55 crc kubenswrapper[4687]: I0228 09:06:55.002355 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 28 09:06:56 crc kubenswrapper[4687]: E0228 09:06:56.718192 4687 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.25.194:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18985dd1035da6c3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Created,Message:Created container startup-monitor,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 09:06:45.656307395 +0000 UTC m=+197.346876731,LastTimestamp:2026-02-28 09:06:45.656307395 +0000 UTC m=+197.346876731,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 28 09:06:57 crc kubenswrapper[4687]: I0228 09:06:57.656299 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 28 09:06:57 crc kubenswrapper[4687]: I0228 09:06:57.657212 4687 status_manager.go:851] "Failed to get status for pod" podUID="de6e0421-8332-4c89-bdcb-4406af730891" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.194:6443: connect: connection refused"
Feb 28 09:06:57 crc kubenswrapper[4687]: I0228 09:06:57.657685 4687 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.25.194:6443: connect: connection refused"
Feb 28 09:06:57 crc kubenswrapper[4687]: I0228 09:06:57.669685 4687 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3860cf0-4834-4fdc-8f65-4b34dc4b907b"
Feb 28 09:06:57 crc kubenswrapper[4687]: I0228 09:06:57.669928 4687 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3860cf0-4834-4fdc-8f65-4b34dc4b907b"
Feb 28 09:06:57 crc kubenswrapper[4687]: E0228 09:06:57.670297 4687 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.194:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 28 09:06:57 crc kubenswrapper[4687]: I0228 09:06:57.670824 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 28 09:06:58 crc kubenswrapper[4687]: E0228 09:06:58.355162 4687 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.194:6443: connect: connection refused" interval="7s"
Feb 28 09:06:58 crc kubenswrapper[4687]: I0228 09:06:58.608993 4687 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="4ad96f7661f1812f90eea691451bc7c0f76068171061b1258e61a3b69c978921" exitCode=0
Feb 28 09:06:58 crc kubenswrapper[4687]: I0228 09:06:58.609091 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"4ad96f7661f1812f90eea691451bc7c0f76068171061b1258e61a3b69c978921"}
Feb 28 09:06:58 crc kubenswrapper[4687]: I0228 09:06:58.609350 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1645a5a5920f7cab1e5b7d288e0cc21932676f7603eb67e701e449322d53108c"}
Feb 28 09:06:58 crc kubenswrapper[4687]: I0228 09:06:58.609620 4687 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3860cf0-4834-4fdc-8f65-4b34dc4b907b"
Feb 28 09:06:58 crc kubenswrapper[4687]: I0228 09:06:58.609635 4687 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3860cf0-4834-4fdc-8f65-4b34dc4b907b"
Feb 28 09:06:58 crc kubenswrapper[4687]: I0228 09:06:58.610062 4687 status_manager.go:851] "Failed to get status for pod" podUID="de6e0421-8332-4c89-bdcb-4406af730891" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.194:6443: connect: connection refused"
Feb 28 09:06:58 crc kubenswrapper[4687]: E0228 09:06:58.610192 4687 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.194:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 28 09:06:58 crc kubenswrapper[4687]: I0228 09:06:58.610322 4687 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.25.194:6443: connect: connection refused"
Feb 28 09:06:58 crc kubenswrapper[4687]: I0228 09:06:58.661200 4687 status_manager.go:851] "Failed to get status for pod" podUID="de6e0421-8332-4c89-bdcb-4406af730891" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.194:6443: connect: connection refused"
Feb 28 09:06:58 crc kubenswrapper[4687]: I0228 09:06:58.661623 4687 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.194:6443: connect: connection refused"
Feb 28 09:06:58 crc kubenswrapper[4687]: I0228 09:06:58.662152 4687 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 192.168.25.194:6443: connect: connection refused"
Feb 28 09:06:59 crc kubenswrapper[4687]: I0228 09:06:59.618793 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b88c9b5408b01f8d80cee83ea705a55579178c61d4f6dbea78ae4cecab878185"}
Feb 28 09:06:59 crc kubenswrapper[4687]: I0228 09:06:59.619115 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"04e6c5d7d2a9f170d0c4e21e964aa585618a2a5892ab7dd4b7dc004c45a2eb65"}
Feb 28 09:06:59 crc kubenswrapper[4687]: I0228 09:06:59.619131 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8681d58a1e500c0055f51d4e4e4df11c77f5df72dd38a80f1c79d29f7a5e76b0"}
Feb 28 09:06:59 crc kubenswrapper[4687]: I0228 09:06:59.619141 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cf7b83e17834ecb52ee93270a7c1a1b6052a11d95e94b13dfb670e07bfc109ee"}
Feb 28 09:06:59 crc kubenswrapper[4687]: I0228 09:06:59.619150 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3398737a73047315f92ec58bc389d59d73b4574b61ed0ea8488efbad24734816"}
Feb 28 09:06:59 crc kubenswrapper[4687]: I0228 09:06:59.619420 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 28 09:06:59 crc kubenswrapper[4687]: I0228 09:06:59.619444 4687 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3860cf0-4834-4fdc-8f65-4b34dc4b907b"
Feb 28 09:06:59 crc kubenswrapper[4687]: I0228 09:06:59.619474 4687 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3860cf0-4834-4fdc-8f65-4b34dc4b907b"
Feb 28 09:07:00 crc kubenswrapper[4687]: E0228 09:07:00.261608 4687 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-c78e20c2e56b7da1c8015367caf37186de5ea7675b3dcf696233ed14753e0d2f.scope\": RecentStats: unable to find data in memory cache]"
Feb 28 09:07:00 crc kubenswrapper[4687]: I0228 09:07:00.625828 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 28 09:07:00 crc kubenswrapper[4687]: I0228 09:07:00.625886 4687 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c78e20c2e56b7da1c8015367caf37186de5ea7675b3dcf696233ed14753e0d2f" exitCode=1
Feb 28 09:07:00 crc kubenswrapper[4687]: I0228 09:07:00.625919 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c78e20c2e56b7da1c8015367caf37186de5ea7675b3dcf696233ed14753e0d2f"}
Feb 28 09:07:00 crc kubenswrapper[4687]: I0228 09:07:00.626534 4687 scope.go:117] "RemoveContainer" containerID="c78e20c2e56b7da1c8015367caf37186de5ea7675b3dcf696233ed14753e0d2f"
Feb 28 09:07:01 crc kubenswrapper[4687]: I0228 09:07:01.637226 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 28 09:07:01 crc kubenswrapper[4687]: I0228 09:07:01.638244 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"37ff944862175cb99543a86971deec9b07b5c8bbdf72f3a8214bb26cf4238a84"}
Feb 28 09:07:02 crc kubenswrapper[4687]: I0228 09:07:02.671131 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 28 09:07:02 crc kubenswrapper[4687]: I0228 09:07:02.671187 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 28 09:07:02 crc kubenswrapper[4687]: I0228 09:07:02.681296 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 28 09:07:04 crc kubenswrapper[4687]: I0228 09:07:04.975413 4687 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 28 09:07:05 crc kubenswrapper[4687]: I0228 09:07:05.001672 4687 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3860cf0-4834-4fdc-8f65-4b34dc4b907b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:06:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:06:58Z\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:06:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T09:06:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3398737a73047315f92ec58bc389d59d73b4574b61ed0ea8488efbad24734816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T09:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8681d58a1e500c0055f51d4e4e4df11c77f5df72dd38a80f1c79d29f7a5e76b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T09:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7b83e17834ecb52ee93270a7c1a1b6052a11d95e94b13dfb670e07bfc109ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T09:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b88c9b5408b01f8d80cee83ea705a55579178c61d4f6dbea78ae4cecab878185\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T09:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e6c5d7d2a9f170d0c4e21e964aa585618a2a5892ab7dd4b7dc004c45a2eb65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T09:06:59Z\\\"}}}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad96f7661f1812f90eea691451bc7c0f76068171061b1258e61a3b69c978921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ad96f7661f1812f90eea691451bc7c0f76068171061b1258e61a3b69c978921\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T09:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T09:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}]}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Pod \"kube-apiserver-crc\" is invalid: metadata.uid: Invalid value: \"c3860cf0-4834-4fdc-8f65-4b34dc4b907b\": field is immutable"
Feb 28 09:07:05 crc kubenswrapper[4687]: I0228 09:07:05.025312 4687 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="2202daba-fc6c-4952-93d7-4315043c4e87"
Feb 28 09:07:05 crc kubenswrapper[4687]: I0228 09:07:05.661753 4687 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3860cf0-4834-4fdc-8f65-4b34dc4b907b"
Feb 28 09:07:05 crc kubenswrapper[4687]: I0228 09:07:05.661810 4687 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3860cf0-4834-4fdc-8f65-4b34dc4b907b"
Feb 28 09:07:05 crc kubenswrapper[4687]: I0228 09:07:05.664558 4687 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="2202daba-fc6c-4952-93d7-4315043c4e87"
Feb 28 09:07:06 crc kubenswrapper[4687]: I0228 09:07:06.963467 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 28 09:07:06 crc kubenswrapper[4687]: I0228 09:07:06.963793 4687 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Feb 28 09:07:06 crc kubenswrapper[4687]: I0228 09:07:06.964347 4687 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Feb 28 09:07:09 crc kubenswrapper[4687]: I0228 09:07:09.959180 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 28 09:07:16 crc kubenswrapper[4687]: I0228 09:07:16.056230 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 28 09:07:16 crc kubenswrapper[4687]: I0228 09:07:16.058524 4687 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 28 09:07:16 crc kubenswrapper[4687]: I0228 09:07:16.069104 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 28 09:07:16 crc kubenswrapper[4687]: I0228 09:07:16.337582 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 28 09:07:16 crc kubenswrapper[4687]: I0228 09:07:16.344338 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 28 09:07:16 crc kubenswrapper[4687]: I0228 09:07:16.430670 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 28 09:07:16 crc kubenswrapper[4687]: I0228 09:07:16.968256 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 28 09:07:16 crc kubenswrapper[4687]: I0228 09:07:16.970600 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 28 09:07:16 crc kubenswrapper[4687]: I0228 09:07:16.972040 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 28 09:07:16 crc kubenswrapper[4687]: I0228 09:07:16.994929 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 28 09:07:17 crc kubenswrapper[4687]: I0228 09:07:17.353760 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 28 09:07:17 crc kubenswrapper[4687]: I0228 09:07:17.748801 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 28 09:07:18 crc kubenswrapper[4687]: I0228 09:07:18.145689 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 28 09:07:18 crc kubenswrapper[4687]: I0228 09:07:18.320434 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 28 09:07:18 crc kubenswrapper[4687]: I0228 09:07:18.554996 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 28 09:07:18 crc kubenswrapper[4687]: I0228 09:07:18.826433 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 28 09:07:18 crc kubenswrapper[4687]: I0228 09:07:18.902854 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 28 09:07:18 crc kubenswrapper[4687]: I0228 09:07:18.911468 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 28 09:07:18 crc kubenswrapper[4687]: I0228 09:07:18.962627 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 28 09:07:19 crc kubenswrapper[4687]: I0228 09:07:19.099930 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 28 09:07:19 crc kubenswrapper[4687]: I0228 09:07:19.104941 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 28 09:07:19 crc kubenswrapper[4687]: I0228 09:07:19.198719 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 28 09:07:19 crc kubenswrapper[4687]: I0228 09:07:19.209450 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 28 09:07:19 crc kubenswrapper[4687]: I0228 09:07:19.322355 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 28 09:07:19 crc kubenswrapper[4687]: I0228 09:07:19.431600 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 28 09:07:19 crc kubenswrapper[4687]: I0228 09:07:19.602263 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 28 09:07:19 crc kubenswrapper[4687]: I0228 09:07:19.613451 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 28 09:07:19 crc kubenswrapper[4687]: I0228 09:07:19.657257 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 28 09:07:19 crc kubenswrapper[4687]: I0228 09:07:19.683781 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 28 09:07:19 crc kubenswrapper[4687]: I0228 09:07:19.794349 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 28 09:07:19 crc kubenswrapper[4687]: I0228 09:07:19.961811 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 28 09:07:19 crc kubenswrapper[4687]: I0228 09:07:19.961823 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Feb 28 09:07:20 crc kubenswrapper[4687]: I0228 09:07:20.024887 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 28 09:07:20 crc kubenswrapper[4687]: I0228 09:07:20.045171 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 28 09:07:20 crc kubenswrapper[4687]: I0228 09:07:20.257044 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 28 09:07:20 crc kubenswrapper[4687]: I0228 09:07:20.262666 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 28 09:07:20 crc kubenswrapper[4687]: I0228 09:07:20.313502 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 28 09:07:20 crc kubenswrapper[4687]: I0228 09:07:20.346346 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 28 09:07:20 crc kubenswrapper[4687]: I0228 09:07:20.416048 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 28 09:07:20 crc kubenswrapper[4687]: I0228 09:07:20.477850 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 28 09:07:20 crc kubenswrapper[4687]: I0228
09:07:20.578333 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 28 09:07:20 crc kubenswrapper[4687]: I0228 09:07:20.668187 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 28 09:07:20 crc kubenswrapper[4687]: I0228 09:07:20.686639 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 28 09:07:20 crc kubenswrapper[4687]: I0228 09:07:20.708677 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 28 09:07:20 crc kubenswrapper[4687]: I0228 09:07:20.718190 4687 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 28 09:07:20 crc kubenswrapper[4687]: I0228 09:07:20.830104 4687 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 28 09:07:20 crc kubenswrapper[4687]: I0228 09:07:20.930859 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 28 09:07:20 crc kubenswrapper[4687]: I0228 09:07:20.992111 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 28 09:07:21 crc kubenswrapper[4687]: I0228 09:07:21.041235 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 28 09:07:21 crc kubenswrapper[4687]: I0228 09:07:21.069397 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 28 09:07:21 crc kubenswrapper[4687]: I0228 09:07:21.087968 4687 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 28 09:07:21 crc 
kubenswrapper[4687]: I0228 09:07:21.136742 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 28 09:07:21 crc kubenswrapper[4687]: I0228 09:07:21.206699 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 28 09:07:21 crc kubenswrapper[4687]: I0228 09:07:21.239908 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 28 09:07:21 crc kubenswrapper[4687]: I0228 09:07:21.252011 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 28 09:07:21 crc kubenswrapper[4687]: I0228 09:07:21.320514 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 28 09:07:21 crc kubenswrapper[4687]: I0228 09:07:21.326983 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 28 09:07:21 crc kubenswrapper[4687]: I0228 09:07:21.407459 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 28 09:07:21 crc kubenswrapper[4687]: I0228 09:07:21.418947 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 28 09:07:21 crc kubenswrapper[4687]: I0228 09:07:21.498665 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 28 09:07:21 crc kubenswrapper[4687]: I0228 09:07:21.530568 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 28 09:07:21 crc kubenswrapper[4687]: I0228 09:07:21.547319 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 
28 09:07:21 crc kubenswrapper[4687]: I0228 09:07:21.559586 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 28 09:07:21 crc kubenswrapper[4687]: I0228 09:07:21.595568 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 28 09:07:21 crc kubenswrapper[4687]: I0228 09:07:21.613248 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 28 09:07:21 crc kubenswrapper[4687]: I0228 09:07:21.840831 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 28 09:07:21 crc kubenswrapper[4687]: I0228 09:07:21.911699 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 28 09:07:22 crc kubenswrapper[4687]: I0228 09:07:22.056500 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 28 09:07:22 crc kubenswrapper[4687]: I0228 09:07:22.065341 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 28 09:07:22 crc kubenswrapper[4687]: I0228 09:07:22.072535 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 28 09:07:22 crc kubenswrapper[4687]: I0228 09:07:22.127216 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 28 09:07:22 crc kubenswrapper[4687]: I0228 09:07:22.158904 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 28 09:07:22 crc kubenswrapper[4687]: I0228 09:07:22.183435 4687 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 28 09:07:22 crc kubenswrapper[4687]: I0228 09:07:22.209290 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 28 09:07:22 crc kubenswrapper[4687]: I0228 09:07:22.234351 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 28 09:07:22 crc kubenswrapper[4687]: I0228 09:07:22.246189 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 28 09:07:22 crc kubenswrapper[4687]: I0228 09:07:22.250649 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 28 09:07:22 crc kubenswrapper[4687]: I0228 09:07:22.358113 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 28 09:07:22 crc kubenswrapper[4687]: I0228 09:07:22.414718 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 28 09:07:22 crc kubenswrapper[4687]: I0228 09:07:22.446118 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 28 09:07:22 crc kubenswrapper[4687]: I0228 09:07:22.507603 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 28 09:07:22 crc kubenswrapper[4687]: I0228 09:07:22.532851 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 28 09:07:22 crc kubenswrapper[4687]: I0228 09:07:22.590280 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 28 09:07:22 crc 
kubenswrapper[4687]: I0228 09:07:22.636714 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 28 09:07:22 crc kubenswrapper[4687]: I0228 09:07:22.682932 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 28 09:07:22 crc kubenswrapper[4687]: I0228 09:07:22.737700 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 28 09:07:22 crc kubenswrapper[4687]: I0228 09:07:22.738086 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 28 09:07:22 crc kubenswrapper[4687]: I0228 09:07:22.738576 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 28 09:07:22 crc kubenswrapper[4687]: I0228 09:07:22.773895 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 28 09:07:22 crc kubenswrapper[4687]: I0228 09:07:22.780842 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 28 09:07:22 crc kubenswrapper[4687]: I0228 09:07:22.829327 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 28 09:07:22 crc kubenswrapper[4687]: I0228 09:07:22.880735 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 28 09:07:22 crc kubenswrapper[4687]: I0228 09:07:22.982925 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 28 09:07:23 crc kubenswrapper[4687]: I0228 09:07:23.085438 4687 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"machine-api-operator-tls" Feb 28 09:07:23 crc kubenswrapper[4687]: I0228 09:07:23.086379 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 28 09:07:23 crc kubenswrapper[4687]: I0228 09:07:23.092376 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 28 09:07:23 crc kubenswrapper[4687]: I0228 09:07:23.178773 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 28 09:07:23 crc kubenswrapper[4687]: I0228 09:07:23.234572 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 28 09:07:23 crc kubenswrapper[4687]: I0228 09:07:23.267009 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 28 09:07:23 crc kubenswrapper[4687]: I0228 09:07:23.294941 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 28 09:07:23 crc kubenswrapper[4687]: I0228 09:07:23.324315 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 28 09:07:23 crc kubenswrapper[4687]: I0228 09:07:23.335613 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 28 09:07:23 crc kubenswrapper[4687]: I0228 09:07:23.381355 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 28 09:07:23 crc kubenswrapper[4687]: I0228 09:07:23.388881 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 28 09:07:23 crc 
kubenswrapper[4687]: I0228 09:07:23.452236 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 28 09:07:23 crc kubenswrapper[4687]: I0228 09:07:23.489365 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 28 09:07:23 crc kubenswrapper[4687]: I0228 09:07:23.527427 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 28 09:07:23 crc kubenswrapper[4687]: I0228 09:07:23.535959 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 28 09:07:23 crc kubenswrapper[4687]: I0228 09:07:23.550313 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 28 09:07:23 crc kubenswrapper[4687]: I0228 09:07:23.558604 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 28 09:07:23 crc kubenswrapper[4687]: I0228 09:07:23.609058 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 28 09:07:23 crc kubenswrapper[4687]: I0228 09:07:23.696530 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 28 09:07:23 crc kubenswrapper[4687]: I0228 09:07:23.787197 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 28 09:07:23 crc kubenswrapper[4687]: I0228 09:07:23.794369 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 28 09:07:23 crc kubenswrapper[4687]: I0228 09:07:23.841455 4687 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 28 09:07:23 crc kubenswrapper[4687]: I0228 09:07:23.963411 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 28 09:07:23 crc kubenswrapper[4687]: I0228 09:07:23.974980 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 28 09:07:24 crc kubenswrapper[4687]: I0228 09:07:24.108164 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 28 09:07:24 crc kubenswrapper[4687]: I0228 09:07:24.130781 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 28 09:07:24 crc kubenswrapper[4687]: I0228 09:07:24.147291 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 28 09:07:24 crc kubenswrapper[4687]: I0228 09:07:24.242433 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 28 09:07:24 crc kubenswrapper[4687]: I0228 09:07:24.300666 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 28 09:07:24 crc kubenswrapper[4687]: I0228 09:07:24.396542 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 28 09:07:24 crc kubenswrapper[4687]: I0228 09:07:24.484352 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 28 09:07:24 crc kubenswrapper[4687]: I0228 09:07:24.509657 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 28 09:07:24 crc kubenswrapper[4687]: I0228 09:07:24.548313 4687 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 28 09:07:24 crc kubenswrapper[4687]: I0228 09:07:24.574859 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 28 09:07:24 crc kubenswrapper[4687]: I0228 09:07:24.574868 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 28 09:07:24 crc kubenswrapper[4687]: I0228 09:07:24.632401 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 28 09:07:24 crc kubenswrapper[4687]: I0228 09:07:24.633885 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 28 09:07:24 crc kubenswrapper[4687]: I0228 09:07:24.634898 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 28 09:07:24 crc kubenswrapper[4687]: I0228 09:07:24.643322 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 28 09:07:24 crc kubenswrapper[4687]: I0228 09:07:24.773817 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 28 09:07:24 crc kubenswrapper[4687]: I0228 09:07:24.787487 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 28 09:07:24 crc kubenswrapper[4687]: I0228 09:07:24.834380 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 28 09:07:24 crc kubenswrapper[4687]: I0228 09:07:24.904079 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 28 09:07:25 crc kubenswrapper[4687]: 
I0228 09:07:25.002349 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.002425 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.030492 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.126522 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.231612 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.236295 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.249805 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.295334 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.319723 4687 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.353457 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.385120 4687 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.385835 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=40.38581294 podStartE2EDuration="40.38581294s" podCreationTimestamp="2026-02-28 09:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:07:05.011753731 +0000 UTC m=+216.702323068" watchObservedRunningTime="2026-02-28 09:07:25.38581294 +0000 UTC m=+237.076382277" Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.390194 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.390263 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.395356 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.395970 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.400546 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.406646 4687 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=21.406631739 podStartE2EDuration="21.406631739s" podCreationTimestamp="2026-02-28 09:07:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:07:25.404037719 +0000 UTC m=+237.094607057" watchObservedRunningTime="2026-02-28 09:07:25.406631739 +0000 UTC m=+237.097201065" Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.439126 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.440909 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.559827 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.579277 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.609339 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.641547 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.685095 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.693596 4687 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"openshift-service-ca.crt" Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.784434 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.892461 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.940169 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 28 09:07:25 crc kubenswrapper[4687]: I0228 09:07:25.940559 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 28 09:07:26 crc kubenswrapper[4687]: I0228 09:07:26.020317 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 28 09:07:26 crc kubenswrapper[4687]: I0228 09:07:26.023644 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 28 09:07:26 crc kubenswrapper[4687]: I0228 09:07:26.086919 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 28 09:07:26 crc kubenswrapper[4687]: I0228 09:07:26.243885 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 28 09:07:26 crc kubenswrapper[4687]: I0228 09:07:26.411171 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 28 09:07:26 crc kubenswrapper[4687]: I0228 09:07:26.416703 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 28 09:07:26 crc kubenswrapper[4687]: I0228 09:07:26.468378 
4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 28 09:07:26 crc kubenswrapper[4687]: I0228 09:07:26.540355 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 28 09:07:26 crc kubenswrapper[4687]: I0228 09:07:26.583585 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 28 09:07:26 crc kubenswrapper[4687]: I0228 09:07:26.608166 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 28 09:07:26 crc kubenswrapper[4687]: I0228 09:07:26.641525 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 28 09:07:26 crc kubenswrapper[4687]: I0228 09:07:26.747830 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 28 09:07:26 crc kubenswrapper[4687]: I0228 09:07:26.771968 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 28 09:07:26 crc kubenswrapper[4687]: I0228 09:07:26.797833 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 28 09:07:26 crc kubenswrapper[4687]: I0228 09:07:26.798298 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 28 09:07:26 crc kubenswrapper[4687]: I0228 09:07:26.848309 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 28 09:07:26 crc kubenswrapper[4687]: I0228 09:07:26.879061 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 28 09:07:26 crc kubenswrapper[4687]: 
I0228 09:07:26.910420 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 28 09:07:27 crc kubenswrapper[4687]: I0228 09:07:27.084329 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 28 09:07:27 crc kubenswrapper[4687]: I0228 09:07:27.154618 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 28 09:07:27 crc kubenswrapper[4687]: I0228 09:07:27.331631 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 28 09:07:27 crc kubenswrapper[4687]: I0228 09:07:27.340513 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 28 09:07:27 crc kubenswrapper[4687]: I0228 09:07:27.439898 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 28 09:07:27 crc kubenswrapper[4687]: I0228 09:07:27.461613 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 28 09:07:27 crc kubenswrapper[4687]: I0228 09:07:27.531225 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 28 09:07:27 crc kubenswrapper[4687]: I0228 09:07:27.595103 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 28 09:07:27 crc kubenswrapper[4687]: I0228 09:07:27.649812 4687 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 28 09:07:27 crc kubenswrapper[4687]: I0228 09:07:27.650163 4687 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://904bce251d642b9e7421cc50ace35d8fe7dd220c90ec2de36ee8a5f20fe6f7e0" gracePeriod=5 Feb 28 09:07:27 crc kubenswrapper[4687]: I0228 09:07:27.661740 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 28 09:07:27 crc kubenswrapper[4687]: I0228 09:07:27.759371 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 28 09:07:27 crc kubenswrapper[4687]: I0228 09:07:27.824131 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 28 09:07:27 crc kubenswrapper[4687]: I0228 09:07:27.860838 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 28 09:07:27 crc kubenswrapper[4687]: I0228 09:07:27.908722 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 28 09:07:27 crc kubenswrapper[4687]: I0228 09:07:27.949431 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 28 09:07:28 crc kubenswrapper[4687]: I0228 09:07:28.193726 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 28 09:07:28 crc kubenswrapper[4687]: I0228 09:07:28.255452 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 28 09:07:28 crc kubenswrapper[4687]: I0228 09:07:28.290805 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 28 09:07:28 crc kubenswrapper[4687]: I0228 09:07:28.308108 4687 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 28 09:07:28 crc kubenswrapper[4687]: I0228 09:07:28.347818 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 28 09:07:28 crc kubenswrapper[4687]: I0228 09:07:28.406111 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 28 09:07:28 crc kubenswrapper[4687]: I0228 09:07:28.425921 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 28 09:07:28 crc kubenswrapper[4687]: I0228 09:07:28.487302 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 28 09:07:28 crc kubenswrapper[4687]: I0228 09:07:28.533993 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 28 09:07:28 crc kubenswrapper[4687]: I0228 09:07:28.594958 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 28 09:07:28 crc kubenswrapper[4687]: I0228 09:07:28.641537 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 28 09:07:28 crc kubenswrapper[4687]: I0228 09:07:28.665943 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 28 09:07:28 crc kubenswrapper[4687]: I0228 09:07:28.729232 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 28 09:07:28 crc kubenswrapper[4687]: I0228 09:07:28.753295 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 28 09:07:28 crc kubenswrapper[4687]: I0228 09:07:28.832208 4687 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 28 09:07:28 crc kubenswrapper[4687]: I0228 09:07:28.976894 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 28 09:07:28 crc kubenswrapper[4687]: I0228 09:07:28.982212 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 28 09:07:28 crc kubenswrapper[4687]: I0228 09:07:28.983716 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 28 09:07:28 crc kubenswrapper[4687]: I0228 09:07:28.985159 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 28 09:07:29 crc kubenswrapper[4687]: I0228 09:07:29.024846 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 28 09:07:29 crc kubenswrapper[4687]: I0228 09:07:29.161176 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 28 09:07:29 crc kubenswrapper[4687]: I0228 09:07:29.161566 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 28 09:07:29 crc kubenswrapper[4687]: I0228 09:07:29.199797 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 28 09:07:29 crc kubenswrapper[4687]: I0228 09:07:29.238667 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 28 09:07:29 crc kubenswrapper[4687]: I0228 09:07:29.319114 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 28 
09:07:29 crc kubenswrapper[4687]: I0228 09:07:29.472296 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 28 09:07:29 crc kubenswrapper[4687]: I0228 09:07:29.514736 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 28 09:07:29 crc kubenswrapper[4687]: I0228 09:07:29.568708 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 28 09:07:29 crc kubenswrapper[4687]: I0228 09:07:29.661978 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 28 09:07:29 crc kubenswrapper[4687]: I0228 09:07:29.663053 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 28 09:07:29 crc kubenswrapper[4687]: I0228 09:07:29.773181 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 28 09:07:29 crc kubenswrapper[4687]: I0228 09:07:29.801489 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 28 09:07:30 crc kubenswrapper[4687]: I0228 09:07:30.452031 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 28 09:07:30 crc kubenswrapper[4687]: I0228 09:07:30.616120 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 28 09:07:30 crc kubenswrapper[4687]: I0228 09:07:30.620040 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 28 09:07:30 crc kubenswrapper[4687]: I0228 09:07:30.626540 4687 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 28 09:07:30 crc kubenswrapper[4687]: I0228 09:07:30.745248 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 28 09:07:30 crc kubenswrapper[4687]: I0228 09:07:30.804925 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 28 09:07:30 crc kubenswrapper[4687]: I0228 09:07:30.841988 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 28 09:07:30 crc kubenswrapper[4687]: I0228 09:07:30.959384 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 28 09:07:30 crc kubenswrapper[4687]: I0228 09:07:30.998471 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 28 09:07:31 crc kubenswrapper[4687]: I0228 09:07:31.162645 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 28 09:07:31 crc kubenswrapper[4687]: I0228 09:07:31.219733 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 28 09:07:31 crc kubenswrapper[4687]: I0228 09:07:31.294454 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 28 09:07:31 crc kubenswrapper[4687]: I0228 09:07:31.297117 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 28 09:07:31 crc kubenswrapper[4687]: I0228 09:07:31.377789 4687 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 28 09:07:31 crc kubenswrapper[4687]: I0228 
09:07:31.914651 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 28 09:07:32 crc kubenswrapper[4687]: I0228 09:07:32.002501 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 28 09:07:32 crc kubenswrapper[4687]: I0228 09:07:32.271948 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 28 09:07:32 crc kubenswrapper[4687]: I0228 09:07:32.392903 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 28 09:07:32 crc kubenswrapper[4687]: I0228 09:07:32.802488 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 28 09:07:32 crc kubenswrapper[4687]: I0228 09:07:32.802676 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 28 09:07:32 crc kubenswrapper[4687]: I0228 09:07:32.814866 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 28 09:07:32 crc kubenswrapper[4687]: I0228 09:07:32.814945 4687 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="904bce251d642b9e7421cc50ace35d8fe7dd220c90ec2de36ee8a5f20fe6f7e0" exitCode=137 Feb 28 09:07:32 crc kubenswrapper[4687]: I0228 09:07:32.818041 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 28 09:07:32 crc kubenswrapper[4687]: I0228 09:07:32.822450 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 28 09:07:32 crc kubenswrapper[4687]: I0228 
09:07:32.886809 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.218180 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.218280 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.302924 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.346626 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.351568 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.394175 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.396221 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.396280 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.396314 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.396333 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.396355 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.396378 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.396393 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.396414 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.396461 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.397082 4687 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.397106 4687 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.397116 4687 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.397127 4687 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.404688 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.478383 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.498212 4687 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.684576 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.821110 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.821213 4687 scope.go:117] "RemoveContainer" containerID="904bce251d642b9e7421cc50ace35d8fe7dd220c90ec2de36ee8a5f20fe6f7e0" Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.821261 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 09:07:33 crc kubenswrapper[4687]: I0228 09:07:33.910581 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 28 09:07:34 crc kubenswrapper[4687]: I0228 09:07:34.260957 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 28 09:07:34 crc kubenswrapper[4687]: I0228 09:07:34.663486 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 28 09:07:34 crc kubenswrapper[4687]: I0228 09:07:34.664051 4687 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Feb 28 09:07:34 crc kubenswrapper[4687]: I0228 09:07:34.674928 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 28 09:07:34 crc kubenswrapper[4687]: I0228 09:07:34.674980 4687 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="8c272ee7-48fe-47ea-bfae-87d950404e2b" Feb 28 09:07:34 crc kubenswrapper[4687]: I0228 09:07:34.678231 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 28 09:07:34 crc kubenswrapper[4687]: I0228 09:07:34.678268 4687 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="8c272ee7-48fe-47ea-bfae-87d950404e2b" Feb 28 09:07:35 crc kubenswrapper[4687]: I0228 09:07:35.992633 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 28 
09:07:55 crc kubenswrapper[4687]: I0228 09:07:55.002537 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:07:55 crc kubenswrapper[4687]: I0228 09:07:55.003125 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:07:55 crc kubenswrapper[4687]: I0228 09:07:55.003188 4687 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" Feb 28 09:07:55 crc kubenswrapper[4687]: I0228 09:07:55.003942 4687 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a4fa09ae345698d6959b87a651d6646b2e144c55db675e36a768b83892b2c64d"} pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 28 09:07:55 crc kubenswrapper[4687]: I0228 09:07:55.004003 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" containerID="cri-o://a4fa09ae345698d6959b87a651d6646b2e144c55db675e36a768b83892b2c64d" gracePeriod=600 Feb 28 09:07:55 crc kubenswrapper[4687]: I0228 09:07:55.915152 4687 generic.go:334] "Generic (PLEG): container finished" podID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" 
containerID="a4fa09ae345698d6959b87a651d6646b2e144c55db675e36a768b83892b2c64d" exitCode=0 Feb 28 09:07:55 crc kubenswrapper[4687]: I0228 09:07:55.915185 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerDied","Data":"a4fa09ae345698d6959b87a651d6646b2e144c55db675e36a768b83892b2c64d"} Feb 28 09:07:55 crc kubenswrapper[4687]: I0228 09:07:55.915602 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerStarted","Data":"fad4ec2f45b132fa1fcbba9b5a4c5891531193748a3177bf121c290113487ba4"} Feb 28 09:08:00 crc kubenswrapper[4687]: I0228 09:08:00.149922 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537828-kf7k8"] Feb 28 09:08:00 crc kubenswrapper[4687]: E0228 09:08:00.150776 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 28 09:08:00 crc kubenswrapper[4687]: I0228 09:08:00.150793 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 28 09:08:00 crc kubenswrapper[4687]: E0228 09:08:00.150810 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de6e0421-8332-4c89-bdcb-4406af730891" containerName="installer" Feb 28 09:08:00 crc kubenswrapper[4687]: I0228 09:08:00.150818 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6e0421-8332-4c89-bdcb-4406af730891" containerName="installer" Feb 28 09:08:00 crc kubenswrapper[4687]: I0228 09:08:00.150968 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="de6e0421-8332-4c89-bdcb-4406af730891" containerName="installer" Feb 28 09:08:00 crc kubenswrapper[4687]: I0228 09:08:00.150982 4687 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 28 09:08:00 crc kubenswrapper[4687]: I0228 09:08:00.151548 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537828-kf7k8" Feb 28 09:08:00 crc kubenswrapper[4687]: I0228 09:08:00.153347 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:08:00 crc kubenswrapper[4687]: I0228 09:08:00.154063 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:08:00 crc kubenswrapper[4687]: I0228 09:08:00.154800 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:08:00 crc kubenswrapper[4687]: I0228 09:08:00.156916 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537828-kf7k8"] Feb 28 09:08:00 crc kubenswrapper[4687]: I0228 09:08:00.299633 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25q5n\" (UniqueName: \"kubernetes.io/projected/f99cb978-113d-4662-b69e-04425f442f83-kube-api-access-25q5n\") pod \"auto-csr-approver-29537828-kf7k8\" (UID: \"f99cb978-113d-4662-b69e-04425f442f83\") " pod="openshift-infra/auto-csr-approver-29537828-kf7k8" Feb 28 09:08:00 crc kubenswrapper[4687]: I0228 09:08:00.401763 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25q5n\" (UniqueName: \"kubernetes.io/projected/f99cb978-113d-4662-b69e-04425f442f83-kube-api-access-25q5n\") pod \"auto-csr-approver-29537828-kf7k8\" (UID: \"f99cb978-113d-4662-b69e-04425f442f83\") " pod="openshift-infra/auto-csr-approver-29537828-kf7k8" Feb 28 09:08:00 crc kubenswrapper[4687]: I0228 09:08:00.418576 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-25q5n\" (UniqueName: \"kubernetes.io/projected/f99cb978-113d-4662-b69e-04425f442f83-kube-api-access-25q5n\") pod \"auto-csr-approver-29537828-kf7k8\" (UID: \"f99cb978-113d-4662-b69e-04425f442f83\") " pod="openshift-infra/auto-csr-approver-29537828-kf7k8" Feb 28 09:08:00 crc kubenswrapper[4687]: I0228 09:08:00.468238 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537828-kf7k8" Feb 28 09:08:00 crc kubenswrapper[4687]: I0228 09:08:00.832466 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537828-kf7k8"] Feb 28 09:08:00 crc kubenswrapper[4687]: W0228 09:08:00.840922 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf99cb978_113d_4662_b69e_04425f442f83.slice/crio-792c0e7299f042ebd23b723facd94336eea8e2dc5aa96ef0165d3fc9f755b8a0 WatchSource:0}: Error finding container 792c0e7299f042ebd23b723facd94336eea8e2dc5aa96ef0165d3fc9f755b8a0: Status 404 returned error can't find the container with id 792c0e7299f042ebd23b723facd94336eea8e2dc5aa96ef0165d3fc9f755b8a0 Feb 28 09:08:00 crc kubenswrapper[4687]: I0228 09:08:00.938976 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537828-kf7k8" event={"ID":"f99cb978-113d-4662-b69e-04425f442f83","Type":"ContainerStarted","Data":"792c0e7299f042ebd23b723facd94336eea8e2dc5aa96ef0165d3fc9f755b8a0"} Feb 28 09:08:01 crc kubenswrapper[4687]: I0228 09:08:01.951603 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537828-kf7k8" event={"ID":"f99cb978-113d-4662-b69e-04425f442f83","Type":"ContainerStarted","Data":"65b8e29dcdb142ad307015f50a6b7f202bd2872d87fa74ea449f42cb0938bde9"} Feb 28 09:08:01 crc kubenswrapper[4687]: I0228 09:08:01.963662 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29537828-kf7k8" podStartSLOduration=1.090997453 podStartE2EDuration="1.963646436s" podCreationTimestamp="2026-02-28 09:08:00 +0000 UTC" firstStartedPulling="2026-02-28 09:08:00.844984962 +0000 UTC m=+272.535554299" lastFinishedPulling="2026-02-28 09:08:01.717633945 +0000 UTC m=+273.408203282" observedRunningTime="2026-02-28 09:08:01.963166462 +0000 UTC m=+273.653735800" watchObservedRunningTime="2026-02-28 09:08:01.963646436 +0000 UTC m=+273.654215773" Feb 28 09:08:02 crc kubenswrapper[4687]: I0228 09:08:02.958837 4687 generic.go:334] "Generic (PLEG): container finished" podID="f99cb978-113d-4662-b69e-04425f442f83" containerID="65b8e29dcdb142ad307015f50a6b7f202bd2872d87fa74ea449f42cb0938bde9" exitCode=0 Feb 28 09:08:02 crc kubenswrapper[4687]: I0228 09:08:02.958940 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537828-kf7k8" event={"ID":"f99cb978-113d-4662-b69e-04425f442f83","Type":"ContainerDied","Data":"65b8e29dcdb142ad307015f50a6b7f202bd2872d87fa74ea449f42cb0938bde9"} Feb 28 09:08:04 crc kubenswrapper[4687]: I0228 09:08:04.174332 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537828-kf7k8" Feb 28 09:08:04 crc kubenswrapper[4687]: I0228 09:08:04.359800 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25q5n\" (UniqueName: \"kubernetes.io/projected/f99cb978-113d-4662-b69e-04425f442f83-kube-api-access-25q5n\") pod \"f99cb978-113d-4662-b69e-04425f442f83\" (UID: \"f99cb978-113d-4662-b69e-04425f442f83\") " Feb 28 09:08:04 crc kubenswrapper[4687]: I0228 09:08:04.364637 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f99cb978-113d-4662-b69e-04425f442f83-kube-api-access-25q5n" (OuterVolumeSpecName: "kube-api-access-25q5n") pod "f99cb978-113d-4662-b69e-04425f442f83" (UID: "f99cb978-113d-4662-b69e-04425f442f83"). InnerVolumeSpecName "kube-api-access-25q5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:08:04 crc kubenswrapper[4687]: I0228 09:08:04.460915 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25q5n\" (UniqueName: \"kubernetes.io/projected/f99cb978-113d-4662-b69e-04425f442f83-kube-api-access-25q5n\") on node \"crc\" DevicePath \"\"" Feb 28 09:08:04 crc kubenswrapper[4687]: I0228 09:08:04.969597 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537828-kf7k8" event={"ID":"f99cb978-113d-4662-b69e-04425f442f83","Type":"ContainerDied","Data":"792c0e7299f042ebd23b723facd94336eea8e2dc5aa96ef0165d3fc9f755b8a0"} Feb 28 09:08:04 crc kubenswrapper[4687]: I0228 09:08:04.969641 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="792c0e7299f042ebd23b723facd94336eea8e2dc5aa96ef0165d3fc9f755b8a0" Feb 28 09:08:04 crc kubenswrapper[4687]: I0228 09:08:04.969658 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537828-kf7k8" Feb 28 09:08:21 crc kubenswrapper[4687]: I0228 09:08:21.571922 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nkgl2"] Feb 28 09:08:21 crc kubenswrapper[4687]: I0228 09:08:21.574681 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nkgl2" podUID="556a0190-2912-4b71-a5ae-70c614769f9d" containerName="registry-server" containerID="cri-o://5d9d0877b1876fa4e990e89ef5e732cdcc8613082ec8187a2a43ebb241cb6118" gracePeriod=30 Feb 28 09:08:21 crc kubenswrapper[4687]: I0228 09:08:21.585635 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f7sr6"] Feb 28 09:08:21 crc kubenswrapper[4687]: I0228 09:08:21.585868 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f7sr6" podUID="19def7b9-fb5d-4e49-98db-784814aa9769" containerName="registry-server" containerID="cri-o://7f2500302ff2b6128c42f68c74c29a77ecd72caf1a43c5e923d347e9e56a59cd" gracePeriod=30 Feb 28 09:08:21 crc kubenswrapper[4687]: I0228 09:08:21.592610 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5xt25"] Feb 28 09:08:21 crc kubenswrapper[4687]: I0228 09:08:21.592825 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-5xt25" podUID="36a32d28-84e1-4c44-b2e5-546c8a1c8853" containerName="marketplace-operator" containerID="cri-o://85fb5046a4c3a4a2cd21722ef55a6c768b9eedfe5a0d63021990ffdb7d8e6985" gracePeriod=30 Feb 28 09:08:21 crc kubenswrapper[4687]: I0228 09:08:21.609176 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-svtsw"] Feb 28 09:08:21 crc kubenswrapper[4687]: I0228 09:08:21.609444 4687 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-svtsw" podUID="9a9c467e-d2ff-4322-bc25-5cfe38dff784" containerName="registry-server" containerID="cri-o://c38452a9600a892d59edb8b34ac75c4650369b7663d70bc75b454c7aeb9ca89a" gracePeriod=30 Feb 28 09:08:21 crc kubenswrapper[4687]: I0228 09:08:21.620315 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-npwxl"] Feb 28 09:08:21 crc kubenswrapper[4687]: I0228 09:08:21.620668 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-npwxl" podUID="69eb70ff-d8c7-4dba-9f8e-1969b7947640" containerName="registry-server" containerID="cri-o://65ebb5c1faa7cb593fe34005a85e0adb7db29e458301111b369fead106e6b736" gracePeriod=30 Feb 28 09:08:21 crc kubenswrapper[4687]: I0228 09:08:21.624655 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qhc57"] Feb 28 09:08:21 crc kubenswrapper[4687]: E0228 09:08:21.624920 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f99cb978-113d-4662-b69e-04425f442f83" containerName="oc" Feb 28 09:08:21 crc kubenswrapper[4687]: I0228 09:08:21.624940 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="f99cb978-113d-4662-b69e-04425f442f83" containerName="oc" Feb 28 09:08:21 crc kubenswrapper[4687]: I0228 09:08:21.625050 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="f99cb978-113d-4662-b69e-04425f442f83" containerName="oc" Feb 28 09:08:21 crc kubenswrapper[4687]: I0228 09:08:21.625539 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qhc57" Feb 28 09:08:21 crc kubenswrapper[4687]: I0228 09:08:21.633350 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qhc57"] Feb 28 09:08:21 crc kubenswrapper[4687]: I0228 09:08:21.703291 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wwbh\" (UniqueName: \"kubernetes.io/projected/e9586004-7da3-41d4-980d-825eafe37f51-kube-api-access-9wwbh\") pod \"marketplace-operator-79b997595-qhc57\" (UID: \"e9586004-7da3-41d4-980d-825eafe37f51\") " pod="openshift-marketplace/marketplace-operator-79b997595-qhc57" Feb 28 09:08:21 crc kubenswrapper[4687]: I0228 09:08:21.703392 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9586004-7da3-41d4-980d-825eafe37f51-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qhc57\" (UID: \"e9586004-7da3-41d4-980d-825eafe37f51\") " pod="openshift-marketplace/marketplace-operator-79b997595-qhc57" Feb 28 09:08:21 crc kubenswrapper[4687]: I0228 09:08:21.703447 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e9586004-7da3-41d4-980d-825eafe37f51-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qhc57\" (UID: \"e9586004-7da3-41d4-980d-825eafe37f51\") " pod="openshift-marketplace/marketplace-operator-79b997595-qhc57" Feb 28 09:08:21 crc kubenswrapper[4687]: I0228 09:08:21.804440 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9586004-7da3-41d4-980d-825eafe37f51-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qhc57\" (UID: 
\"e9586004-7da3-41d4-980d-825eafe37f51\") " pod="openshift-marketplace/marketplace-operator-79b997595-qhc57" Feb 28 09:08:21 crc kubenswrapper[4687]: I0228 09:08:21.804512 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e9586004-7da3-41d4-980d-825eafe37f51-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qhc57\" (UID: \"e9586004-7da3-41d4-980d-825eafe37f51\") " pod="openshift-marketplace/marketplace-operator-79b997595-qhc57" Feb 28 09:08:21 crc kubenswrapper[4687]: I0228 09:08:21.804614 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wwbh\" (UniqueName: \"kubernetes.io/projected/e9586004-7da3-41d4-980d-825eafe37f51-kube-api-access-9wwbh\") pod \"marketplace-operator-79b997595-qhc57\" (UID: \"e9586004-7da3-41d4-980d-825eafe37f51\") " pod="openshift-marketplace/marketplace-operator-79b997595-qhc57" Feb 28 09:08:21 crc kubenswrapper[4687]: I0228 09:08:21.806262 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9586004-7da3-41d4-980d-825eafe37f51-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qhc57\" (UID: \"e9586004-7da3-41d4-980d-825eafe37f51\") " pod="openshift-marketplace/marketplace-operator-79b997595-qhc57" Feb 28 09:08:21 crc kubenswrapper[4687]: I0228 09:08:21.812933 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e9586004-7da3-41d4-980d-825eafe37f51-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qhc57\" (UID: \"e9586004-7da3-41d4-980d-825eafe37f51\") " pod="openshift-marketplace/marketplace-operator-79b997595-qhc57" Feb 28 09:08:21 crc kubenswrapper[4687]: I0228 09:08:21.820942 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9wwbh\" (UniqueName: \"kubernetes.io/projected/e9586004-7da3-41d4-980d-825eafe37f51-kube-api-access-9wwbh\") pod \"marketplace-operator-79b997595-qhc57\" (UID: \"e9586004-7da3-41d4-980d-825eafe37f51\") " pod="openshift-marketplace/marketplace-operator-79b997595-qhc57" Feb 28 09:08:21 crc kubenswrapper[4687]: I0228 09:08:21.960627 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qhc57" Feb 28 09:08:21 crc kubenswrapper[4687]: I0228 09:08:21.965237 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nkgl2" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.006354 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5xt25" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.011367 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-npwxl" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.023423 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-svtsw" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.024470 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f7sr6" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.075737 4687 generic.go:334] "Generic (PLEG): container finished" podID="19def7b9-fb5d-4e49-98db-784814aa9769" containerID="7f2500302ff2b6128c42f68c74c29a77ecd72caf1a43c5e923d347e9e56a59cd" exitCode=0 Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.075819 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f7sr6" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.075832 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7sr6" event={"ID":"19def7b9-fb5d-4e49-98db-784814aa9769","Type":"ContainerDied","Data":"7f2500302ff2b6128c42f68c74c29a77ecd72caf1a43c5e923d347e9e56a59cd"} Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.075900 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7sr6" event={"ID":"19def7b9-fb5d-4e49-98db-784814aa9769","Type":"ContainerDied","Data":"e1bab61eeb1ab6f618584924a2c5e62e42dbec831ffc6cd5dbec9187740bca27"} Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.075942 4687 scope.go:117] "RemoveContainer" containerID="7f2500302ff2b6128c42f68c74c29a77ecd72caf1a43c5e923d347e9e56a59cd" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.077686 4687 generic.go:334] "Generic (PLEG): container finished" podID="36a32d28-84e1-4c44-b2e5-546c8a1c8853" containerID="85fb5046a4c3a4a2cd21722ef55a6c768b9eedfe5a0d63021990ffdb7d8e6985" exitCode=0 Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.077737 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5xt25" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.077755 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5xt25" event={"ID":"36a32d28-84e1-4c44-b2e5-546c8a1c8853","Type":"ContainerDied","Data":"85fb5046a4c3a4a2cd21722ef55a6c768b9eedfe5a0d63021990ffdb7d8e6985"} Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.077861 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5xt25" event={"ID":"36a32d28-84e1-4c44-b2e5-546c8a1c8853","Type":"ContainerDied","Data":"593598127fe594baf35099293c658e6c1477e5aab712b4346785b542fc00758c"} Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.080499 4687 generic.go:334] "Generic (PLEG): container finished" podID="69eb70ff-d8c7-4dba-9f8e-1969b7947640" containerID="65ebb5c1faa7cb593fe34005a85e0adb7db29e458301111b369fead106e6b736" exitCode=0 Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.080544 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-npwxl" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.080574 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npwxl" event={"ID":"69eb70ff-d8c7-4dba-9f8e-1969b7947640","Type":"ContainerDied","Data":"65ebb5c1faa7cb593fe34005a85e0adb7db29e458301111b369fead106e6b736"} Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.080593 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npwxl" event={"ID":"69eb70ff-d8c7-4dba-9f8e-1969b7947640","Type":"ContainerDied","Data":"b5eefc022d09438f17d293a881385620a898ee994dc4031e3ed0c27460bf8e06"} Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.086623 4687 generic.go:334] "Generic (PLEG): container finished" podID="556a0190-2912-4b71-a5ae-70c614769f9d" containerID="5d9d0877b1876fa4e990e89ef5e732cdcc8613082ec8187a2a43ebb241cb6118" exitCode=0 Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.086687 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nkgl2" event={"ID":"556a0190-2912-4b71-a5ae-70c614769f9d","Type":"ContainerDied","Data":"5d9d0877b1876fa4e990e89ef5e732cdcc8613082ec8187a2a43ebb241cb6118"} Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.086723 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nkgl2" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.086747 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nkgl2" event={"ID":"556a0190-2912-4b71-a5ae-70c614769f9d","Type":"ContainerDied","Data":"85c58770d4bc3b236df8ddb1c1b1bb88ad29cde19f8389f84330e429a725cb2d"} Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.090464 4687 generic.go:334] "Generic (PLEG): container finished" podID="9a9c467e-d2ff-4322-bc25-5cfe38dff784" containerID="c38452a9600a892d59edb8b34ac75c4650369b7663d70bc75b454c7aeb9ca89a" exitCode=0 Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.090499 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-svtsw" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.090504 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svtsw" event={"ID":"9a9c467e-d2ff-4322-bc25-5cfe38dff784","Type":"ContainerDied","Data":"c38452a9600a892d59edb8b34ac75c4650369b7663d70bc75b454c7aeb9ca89a"} Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.090648 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svtsw" event={"ID":"9a9c467e-d2ff-4322-bc25-5cfe38dff784","Type":"ContainerDied","Data":"afb5a525dca570e091fd8a08e97d3c21592dddb59f94b5917f1d7ccb0a3e32da"} Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.096150 4687 scope.go:117] "RemoveContainer" containerID="3553f8a98332f2bab154072ddb689b4c39eeb8ea73c97d2d8a5eb9b85ac11d9f" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.109154 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/556a0190-2912-4b71-a5ae-70c614769f9d-catalog-content\") pod \"556a0190-2912-4b71-a5ae-70c614769f9d\" (UID: 
\"556a0190-2912-4b71-a5ae-70c614769f9d\") " Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.109243 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdlxb\" (UniqueName: \"kubernetes.io/projected/36a32d28-84e1-4c44-b2e5-546c8a1c8853-kube-api-access-pdlxb\") pod \"36a32d28-84e1-4c44-b2e5-546c8a1c8853\" (UID: \"36a32d28-84e1-4c44-b2e5-546c8a1c8853\") " Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.109277 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/556a0190-2912-4b71-a5ae-70c614769f9d-utilities\") pod \"556a0190-2912-4b71-a5ae-70c614769f9d\" (UID: \"556a0190-2912-4b71-a5ae-70c614769f9d\") " Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.109306 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn6hl\" (UniqueName: \"kubernetes.io/projected/556a0190-2912-4b71-a5ae-70c614769f9d-kube-api-access-tn6hl\") pod \"556a0190-2912-4b71-a5ae-70c614769f9d\" (UID: \"556a0190-2912-4b71-a5ae-70c614769f9d\") " Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.109424 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36a32d28-84e1-4c44-b2e5-546c8a1c8853-marketplace-trusted-ca\") pod \"36a32d28-84e1-4c44-b2e5-546c8a1c8853\" (UID: \"36a32d28-84e1-4c44-b2e5-546c8a1c8853\") " Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.109502 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/36a32d28-84e1-4c44-b2e5-546c8a1c8853-marketplace-operator-metrics\") pod \"36a32d28-84e1-4c44-b2e5-546c8a1c8853\" (UID: \"36a32d28-84e1-4c44-b2e5-546c8a1c8853\") " Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.110375 4687 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/556a0190-2912-4b71-a5ae-70c614769f9d-utilities" (OuterVolumeSpecName: "utilities") pod "556a0190-2912-4b71-a5ae-70c614769f9d" (UID: "556a0190-2912-4b71-a5ae-70c614769f9d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.110570 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36a32d28-84e1-4c44-b2e5-546c8a1c8853-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "36a32d28-84e1-4c44-b2e5-546c8a1c8853" (UID: "36a32d28-84e1-4c44-b2e5-546c8a1c8853"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.116786 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36a32d28-84e1-4c44-b2e5-546c8a1c8853-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "36a32d28-84e1-4c44-b2e5-546c8a1c8853" (UID: "36a32d28-84e1-4c44-b2e5-546c8a1c8853"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.117645 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36a32d28-84e1-4c44-b2e5-546c8a1c8853-kube-api-access-pdlxb" (OuterVolumeSpecName: "kube-api-access-pdlxb") pod "36a32d28-84e1-4c44-b2e5-546c8a1c8853" (UID: "36a32d28-84e1-4c44-b2e5-546c8a1c8853"). InnerVolumeSpecName "kube-api-access-pdlxb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.117766 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/556a0190-2912-4b71-a5ae-70c614769f9d-kube-api-access-tn6hl" (OuterVolumeSpecName: "kube-api-access-tn6hl") pod "556a0190-2912-4b71-a5ae-70c614769f9d" (UID: "556a0190-2912-4b71-a5ae-70c614769f9d"). InnerVolumeSpecName "kube-api-access-tn6hl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.117796 4687 scope.go:117] "RemoveContainer" containerID="4d5a644f693b9d6c4725eb3bdefdb5c105449209e31490049ade8ca2c7770e1a" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.131246 4687 scope.go:117] "RemoveContainer" containerID="7f2500302ff2b6128c42f68c74c29a77ecd72caf1a43c5e923d347e9e56a59cd" Feb 28 09:08:22 crc kubenswrapper[4687]: E0228 09:08:22.131620 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f2500302ff2b6128c42f68c74c29a77ecd72caf1a43c5e923d347e9e56a59cd\": container with ID starting with 7f2500302ff2b6128c42f68c74c29a77ecd72caf1a43c5e923d347e9e56a59cd not found: ID does not exist" containerID="7f2500302ff2b6128c42f68c74c29a77ecd72caf1a43c5e923d347e9e56a59cd" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.131658 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f2500302ff2b6128c42f68c74c29a77ecd72caf1a43c5e923d347e9e56a59cd"} err="failed to get container status \"7f2500302ff2b6128c42f68c74c29a77ecd72caf1a43c5e923d347e9e56a59cd\": rpc error: code = NotFound desc = could not find container \"7f2500302ff2b6128c42f68c74c29a77ecd72caf1a43c5e923d347e9e56a59cd\": container with ID starting with 7f2500302ff2b6128c42f68c74c29a77ecd72caf1a43c5e923d347e9e56a59cd not found: ID does not exist" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.131689 
4687 scope.go:117] "RemoveContainer" containerID="3553f8a98332f2bab154072ddb689b4c39eeb8ea73c97d2d8a5eb9b85ac11d9f" Feb 28 09:08:22 crc kubenswrapper[4687]: E0228 09:08:22.131955 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3553f8a98332f2bab154072ddb689b4c39eeb8ea73c97d2d8a5eb9b85ac11d9f\": container with ID starting with 3553f8a98332f2bab154072ddb689b4c39eeb8ea73c97d2d8a5eb9b85ac11d9f not found: ID does not exist" containerID="3553f8a98332f2bab154072ddb689b4c39eeb8ea73c97d2d8a5eb9b85ac11d9f" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.131981 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3553f8a98332f2bab154072ddb689b4c39eeb8ea73c97d2d8a5eb9b85ac11d9f"} err="failed to get container status \"3553f8a98332f2bab154072ddb689b4c39eeb8ea73c97d2d8a5eb9b85ac11d9f\": rpc error: code = NotFound desc = could not find container \"3553f8a98332f2bab154072ddb689b4c39eeb8ea73c97d2d8a5eb9b85ac11d9f\": container with ID starting with 3553f8a98332f2bab154072ddb689b4c39eeb8ea73c97d2d8a5eb9b85ac11d9f not found: ID does not exist" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.131996 4687 scope.go:117] "RemoveContainer" containerID="4d5a644f693b9d6c4725eb3bdefdb5c105449209e31490049ade8ca2c7770e1a" Feb 28 09:08:22 crc kubenswrapper[4687]: E0228 09:08:22.132299 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d5a644f693b9d6c4725eb3bdefdb5c105449209e31490049ade8ca2c7770e1a\": container with ID starting with 4d5a644f693b9d6c4725eb3bdefdb5c105449209e31490049ade8ca2c7770e1a not found: ID does not exist" containerID="4d5a644f693b9d6c4725eb3bdefdb5c105449209e31490049ade8ca2c7770e1a" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.132329 4687 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4d5a644f693b9d6c4725eb3bdefdb5c105449209e31490049ade8ca2c7770e1a"} err="failed to get container status \"4d5a644f693b9d6c4725eb3bdefdb5c105449209e31490049ade8ca2c7770e1a\": rpc error: code = NotFound desc = could not find container \"4d5a644f693b9d6c4725eb3bdefdb5c105449209e31490049ade8ca2c7770e1a\": container with ID starting with 4d5a644f693b9d6c4725eb3bdefdb5c105449209e31490049ade8ca2c7770e1a not found: ID does not exist" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.132346 4687 scope.go:117] "RemoveContainer" containerID="85fb5046a4c3a4a2cd21722ef55a6c768b9eedfe5a0d63021990ffdb7d8e6985" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.146237 4687 scope.go:117] "RemoveContainer" containerID="85fb5046a4c3a4a2cd21722ef55a6c768b9eedfe5a0d63021990ffdb7d8e6985" Feb 28 09:08:22 crc kubenswrapper[4687]: E0228 09:08:22.146800 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85fb5046a4c3a4a2cd21722ef55a6c768b9eedfe5a0d63021990ffdb7d8e6985\": container with ID starting with 85fb5046a4c3a4a2cd21722ef55a6c768b9eedfe5a0d63021990ffdb7d8e6985 not found: ID does not exist" containerID="85fb5046a4c3a4a2cd21722ef55a6c768b9eedfe5a0d63021990ffdb7d8e6985" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.146867 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85fb5046a4c3a4a2cd21722ef55a6c768b9eedfe5a0d63021990ffdb7d8e6985"} err="failed to get container status \"85fb5046a4c3a4a2cd21722ef55a6c768b9eedfe5a0d63021990ffdb7d8e6985\": rpc error: code = NotFound desc = could not find container \"85fb5046a4c3a4a2cd21722ef55a6c768b9eedfe5a0d63021990ffdb7d8e6985\": container with ID starting with 85fb5046a4c3a4a2cd21722ef55a6c768b9eedfe5a0d63021990ffdb7d8e6985 not found: ID does not exist" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.146926 4687 scope.go:117] "RemoveContainer" 
containerID="65ebb5c1faa7cb593fe34005a85e0adb7db29e458301111b369fead106e6b736" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.156818 4687 scope.go:117] "RemoveContainer" containerID="8f9269440387fcd1384a163214cb39f6725a938b641150d79b6f766b30ea3ef1" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.163284 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/556a0190-2912-4b71-a5ae-70c614769f9d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "556a0190-2912-4b71-a5ae-70c614769f9d" (UID: "556a0190-2912-4b71-a5ae-70c614769f9d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.168324 4687 scope.go:117] "RemoveContainer" containerID="6c06d165e515c771f69082cac703d59cd8f8f4a6f9338df5e86de9318445ca3e" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.182214 4687 scope.go:117] "RemoveContainer" containerID="65ebb5c1faa7cb593fe34005a85e0adb7db29e458301111b369fead106e6b736" Feb 28 09:08:22 crc kubenswrapper[4687]: E0228 09:08:22.183208 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65ebb5c1faa7cb593fe34005a85e0adb7db29e458301111b369fead106e6b736\": container with ID starting with 65ebb5c1faa7cb593fe34005a85e0adb7db29e458301111b369fead106e6b736 not found: ID does not exist" containerID="65ebb5c1faa7cb593fe34005a85e0adb7db29e458301111b369fead106e6b736" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.183283 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65ebb5c1faa7cb593fe34005a85e0adb7db29e458301111b369fead106e6b736"} err="failed to get container status \"65ebb5c1faa7cb593fe34005a85e0adb7db29e458301111b369fead106e6b736\": rpc error: code = NotFound desc = could not find container \"65ebb5c1faa7cb593fe34005a85e0adb7db29e458301111b369fead106e6b736\": 
container with ID starting with 65ebb5c1faa7cb593fe34005a85e0adb7db29e458301111b369fead106e6b736 not found: ID does not exist" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.183321 4687 scope.go:117] "RemoveContainer" containerID="8f9269440387fcd1384a163214cb39f6725a938b641150d79b6f766b30ea3ef1" Feb 28 09:08:22 crc kubenswrapper[4687]: E0228 09:08:22.183624 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f9269440387fcd1384a163214cb39f6725a938b641150d79b6f766b30ea3ef1\": container with ID starting with 8f9269440387fcd1384a163214cb39f6725a938b641150d79b6f766b30ea3ef1 not found: ID does not exist" containerID="8f9269440387fcd1384a163214cb39f6725a938b641150d79b6f766b30ea3ef1" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.183653 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f9269440387fcd1384a163214cb39f6725a938b641150d79b6f766b30ea3ef1"} err="failed to get container status \"8f9269440387fcd1384a163214cb39f6725a938b641150d79b6f766b30ea3ef1\": rpc error: code = NotFound desc = could not find container \"8f9269440387fcd1384a163214cb39f6725a938b641150d79b6f766b30ea3ef1\": container with ID starting with 8f9269440387fcd1384a163214cb39f6725a938b641150d79b6f766b30ea3ef1 not found: ID does not exist" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.183675 4687 scope.go:117] "RemoveContainer" containerID="6c06d165e515c771f69082cac703d59cd8f8f4a6f9338df5e86de9318445ca3e" Feb 28 09:08:22 crc kubenswrapper[4687]: E0228 09:08:22.183968 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c06d165e515c771f69082cac703d59cd8f8f4a6f9338df5e86de9318445ca3e\": container with ID starting with 6c06d165e515c771f69082cac703d59cd8f8f4a6f9338df5e86de9318445ca3e not found: ID does not exist" 
containerID="6c06d165e515c771f69082cac703d59cd8f8f4a6f9338df5e86de9318445ca3e" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.184016 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c06d165e515c771f69082cac703d59cd8f8f4a6f9338df5e86de9318445ca3e"} err="failed to get container status \"6c06d165e515c771f69082cac703d59cd8f8f4a6f9338df5e86de9318445ca3e\": rpc error: code = NotFound desc = could not find container \"6c06d165e515c771f69082cac703d59cd8f8f4a6f9338df5e86de9318445ca3e\": container with ID starting with 6c06d165e515c771f69082cac703d59cd8f8f4a6f9338df5e86de9318445ca3e not found: ID does not exist" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.184061 4687 scope.go:117] "RemoveContainer" containerID="5d9d0877b1876fa4e990e89ef5e732cdcc8613082ec8187a2a43ebb241cb6118" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.194907 4687 scope.go:117] "RemoveContainer" containerID="2edf1d7888590d49ff2841dc80d960df8aa15b63e020d11f69adab67340380ea" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.206032 4687 scope.go:117] "RemoveContainer" containerID="fd8cd6896f58aec20617dd916cb60d6f4c397640f75a779e71f69a4b89917ac3" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.212057 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzd4r\" (UniqueName: \"kubernetes.io/projected/69eb70ff-d8c7-4dba-9f8e-1969b7947640-kube-api-access-jzd4r\") pod \"69eb70ff-d8c7-4dba-9f8e-1969b7947640\" (UID: \"69eb70ff-d8c7-4dba-9f8e-1969b7947640\") " Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.212110 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92df9\" (UniqueName: \"kubernetes.io/projected/19def7b9-fb5d-4e49-98db-784814aa9769-kube-api-access-92df9\") pod \"19def7b9-fb5d-4e49-98db-784814aa9769\" (UID: \"19def7b9-fb5d-4e49-98db-784814aa9769\") " Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 
09:08:22.212159 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19def7b9-fb5d-4e49-98db-784814aa9769-catalog-content\") pod \"19def7b9-fb5d-4e49-98db-784814aa9769\" (UID: \"19def7b9-fb5d-4e49-98db-784814aa9769\") " Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.212244 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a9c467e-d2ff-4322-bc25-5cfe38dff784-catalog-content\") pod \"9a9c467e-d2ff-4322-bc25-5cfe38dff784\" (UID: \"9a9c467e-d2ff-4322-bc25-5cfe38dff784\") " Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.212283 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wgdh\" (UniqueName: \"kubernetes.io/projected/9a9c467e-d2ff-4322-bc25-5cfe38dff784-kube-api-access-2wgdh\") pod \"9a9c467e-d2ff-4322-bc25-5cfe38dff784\" (UID: \"9a9c467e-d2ff-4322-bc25-5cfe38dff784\") " Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.212310 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19def7b9-fb5d-4e49-98db-784814aa9769-utilities\") pod \"19def7b9-fb5d-4e49-98db-784814aa9769\" (UID: \"19def7b9-fb5d-4e49-98db-784814aa9769\") " Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.212343 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69eb70ff-d8c7-4dba-9f8e-1969b7947640-utilities\") pod \"69eb70ff-d8c7-4dba-9f8e-1969b7947640\" (UID: \"69eb70ff-d8c7-4dba-9f8e-1969b7947640\") " Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.212382 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a9c467e-d2ff-4322-bc25-5cfe38dff784-utilities\") pod 
\"9a9c467e-d2ff-4322-bc25-5cfe38dff784\" (UID: \"9a9c467e-d2ff-4322-bc25-5cfe38dff784\") " Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.212428 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69eb70ff-d8c7-4dba-9f8e-1969b7947640-catalog-content\") pod \"69eb70ff-d8c7-4dba-9f8e-1969b7947640\" (UID: \"69eb70ff-d8c7-4dba-9f8e-1969b7947640\") " Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.212804 4687 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36a32d28-84e1-4c44-b2e5-546c8a1c8853-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.212824 4687 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/36a32d28-84e1-4c44-b2e5-546c8a1c8853-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.212837 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/556a0190-2912-4b71-a5ae-70c614769f9d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.212849 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdlxb\" (UniqueName: \"kubernetes.io/projected/36a32d28-84e1-4c44-b2e5-546c8a1c8853-kube-api-access-pdlxb\") on node \"crc\" DevicePath \"\"" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.212859 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/556a0190-2912-4b71-a5ae-70c614769f9d-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.212869 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn6hl\" (UniqueName: 
\"kubernetes.io/projected/556a0190-2912-4b71-a5ae-70c614769f9d-kube-api-access-tn6hl\") on node \"crc\" DevicePath \"\"" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.213549 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19def7b9-fb5d-4e49-98db-784814aa9769-utilities" (OuterVolumeSpecName: "utilities") pod "19def7b9-fb5d-4e49-98db-784814aa9769" (UID: "19def7b9-fb5d-4e49-98db-784814aa9769"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.214263 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a9c467e-d2ff-4322-bc25-5cfe38dff784-utilities" (OuterVolumeSpecName: "utilities") pod "9a9c467e-d2ff-4322-bc25-5cfe38dff784" (UID: "9a9c467e-d2ff-4322-bc25-5cfe38dff784"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.214429 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69eb70ff-d8c7-4dba-9f8e-1969b7947640-utilities" (OuterVolumeSpecName: "utilities") pod "69eb70ff-d8c7-4dba-9f8e-1969b7947640" (UID: "69eb70ff-d8c7-4dba-9f8e-1969b7947640"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.216059 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a9c467e-d2ff-4322-bc25-5cfe38dff784-kube-api-access-2wgdh" (OuterVolumeSpecName: "kube-api-access-2wgdh") pod "9a9c467e-d2ff-4322-bc25-5cfe38dff784" (UID: "9a9c467e-d2ff-4322-bc25-5cfe38dff784"). InnerVolumeSpecName "kube-api-access-2wgdh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.217210 4687 scope.go:117] "RemoveContainer" containerID="5d9d0877b1876fa4e990e89ef5e732cdcc8613082ec8187a2a43ebb241cb6118" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.217327 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69eb70ff-d8c7-4dba-9f8e-1969b7947640-kube-api-access-jzd4r" (OuterVolumeSpecName: "kube-api-access-jzd4r") pod "69eb70ff-d8c7-4dba-9f8e-1969b7947640" (UID: "69eb70ff-d8c7-4dba-9f8e-1969b7947640"). InnerVolumeSpecName "kube-api-access-jzd4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.217536 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19def7b9-fb5d-4e49-98db-784814aa9769-kube-api-access-92df9" (OuterVolumeSpecName: "kube-api-access-92df9") pod "19def7b9-fb5d-4e49-98db-784814aa9769" (UID: "19def7b9-fb5d-4e49-98db-784814aa9769"). InnerVolumeSpecName "kube-api-access-92df9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:08:22 crc kubenswrapper[4687]: E0228 09:08:22.217559 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d9d0877b1876fa4e990e89ef5e732cdcc8613082ec8187a2a43ebb241cb6118\": container with ID starting with 5d9d0877b1876fa4e990e89ef5e732cdcc8613082ec8187a2a43ebb241cb6118 not found: ID does not exist" containerID="5d9d0877b1876fa4e990e89ef5e732cdcc8613082ec8187a2a43ebb241cb6118" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.217598 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d9d0877b1876fa4e990e89ef5e732cdcc8613082ec8187a2a43ebb241cb6118"} err="failed to get container status \"5d9d0877b1876fa4e990e89ef5e732cdcc8613082ec8187a2a43ebb241cb6118\": rpc error: code = NotFound desc = could not find container \"5d9d0877b1876fa4e990e89ef5e732cdcc8613082ec8187a2a43ebb241cb6118\": container with ID starting with 5d9d0877b1876fa4e990e89ef5e732cdcc8613082ec8187a2a43ebb241cb6118 not found: ID does not exist" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.217627 4687 scope.go:117] "RemoveContainer" containerID="2edf1d7888590d49ff2841dc80d960df8aa15b63e020d11f69adab67340380ea" Feb 28 09:08:22 crc kubenswrapper[4687]: E0228 09:08:22.217942 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2edf1d7888590d49ff2841dc80d960df8aa15b63e020d11f69adab67340380ea\": container with ID starting with 2edf1d7888590d49ff2841dc80d960df8aa15b63e020d11f69adab67340380ea not found: ID does not exist" containerID="2edf1d7888590d49ff2841dc80d960df8aa15b63e020d11f69adab67340380ea" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.217981 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2edf1d7888590d49ff2841dc80d960df8aa15b63e020d11f69adab67340380ea"} 
err="failed to get container status \"2edf1d7888590d49ff2841dc80d960df8aa15b63e020d11f69adab67340380ea\": rpc error: code = NotFound desc = could not find container \"2edf1d7888590d49ff2841dc80d960df8aa15b63e020d11f69adab67340380ea\": container with ID starting with 2edf1d7888590d49ff2841dc80d960df8aa15b63e020d11f69adab67340380ea not found: ID does not exist" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.218006 4687 scope.go:117] "RemoveContainer" containerID="fd8cd6896f58aec20617dd916cb60d6f4c397640f75a779e71f69a4b89917ac3" Feb 28 09:08:22 crc kubenswrapper[4687]: E0228 09:08:22.218278 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd8cd6896f58aec20617dd916cb60d6f4c397640f75a779e71f69a4b89917ac3\": container with ID starting with fd8cd6896f58aec20617dd916cb60d6f4c397640f75a779e71f69a4b89917ac3 not found: ID does not exist" containerID="fd8cd6896f58aec20617dd916cb60d6f4c397640f75a779e71f69a4b89917ac3" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.218306 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd8cd6896f58aec20617dd916cb60d6f4c397640f75a779e71f69a4b89917ac3"} err="failed to get container status \"fd8cd6896f58aec20617dd916cb60d6f4c397640f75a779e71f69a4b89917ac3\": rpc error: code = NotFound desc = could not find container \"fd8cd6896f58aec20617dd916cb60d6f4c397640f75a779e71f69a4b89917ac3\": container with ID starting with fd8cd6896f58aec20617dd916cb60d6f4c397640f75a779e71f69a4b89917ac3 not found: ID does not exist" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.218322 4687 scope.go:117] "RemoveContainer" containerID="c38452a9600a892d59edb8b34ac75c4650369b7663d70bc75b454c7aeb9ca89a" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.237196 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a9c467e-d2ff-4322-bc25-5cfe38dff784-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "9a9c467e-d2ff-4322-bc25-5cfe38dff784" (UID: "9a9c467e-d2ff-4322-bc25-5cfe38dff784"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.238224 4687 scope.go:117] "RemoveContainer" containerID="d9e7885e0300d4a9b1f06cbbd1dadd1876b37852eda1388b1b7afe90799ec894" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.250808 4687 scope.go:117] "RemoveContainer" containerID="8a1dfa5e933b1a8fb416fa8f82b3279a6d61c79acc1404a02f0b963a11cd365f" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.261992 4687 scope.go:117] "RemoveContainer" containerID="c38452a9600a892d59edb8b34ac75c4650369b7663d70bc75b454c7aeb9ca89a" Feb 28 09:08:22 crc kubenswrapper[4687]: E0228 09:08:22.262292 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c38452a9600a892d59edb8b34ac75c4650369b7663d70bc75b454c7aeb9ca89a\": container with ID starting with c38452a9600a892d59edb8b34ac75c4650369b7663d70bc75b454c7aeb9ca89a not found: ID does not exist" containerID="c38452a9600a892d59edb8b34ac75c4650369b7663d70bc75b454c7aeb9ca89a" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.262315 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c38452a9600a892d59edb8b34ac75c4650369b7663d70bc75b454c7aeb9ca89a"} err="failed to get container status \"c38452a9600a892d59edb8b34ac75c4650369b7663d70bc75b454c7aeb9ca89a\": rpc error: code = NotFound desc = could not find container \"c38452a9600a892d59edb8b34ac75c4650369b7663d70bc75b454c7aeb9ca89a\": container with ID starting with c38452a9600a892d59edb8b34ac75c4650369b7663d70bc75b454c7aeb9ca89a not found: ID does not exist" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.262329 4687 scope.go:117] "RemoveContainer" containerID="d9e7885e0300d4a9b1f06cbbd1dadd1876b37852eda1388b1b7afe90799ec894" 
Feb 28 09:08:22 crc kubenswrapper[4687]: E0228 09:08:22.262560 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9e7885e0300d4a9b1f06cbbd1dadd1876b37852eda1388b1b7afe90799ec894\": container with ID starting with d9e7885e0300d4a9b1f06cbbd1dadd1876b37852eda1388b1b7afe90799ec894 not found: ID does not exist" containerID="d9e7885e0300d4a9b1f06cbbd1dadd1876b37852eda1388b1b7afe90799ec894" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.262580 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9e7885e0300d4a9b1f06cbbd1dadd1876b37852eda1388b1b7afe90799ec894"} err="failed to get container status \"d9e7885e0300d4a9b1f06cbbd1dadd1876b37852eda1388b1b7afe90799ec894\": rpc error: code = NotFound desc = could not find container \"d9e7885e0300d4a9b1f06cbbd1dadd1876b37852eda1388b1b7afe90799ec894\": container with ID starting with d9e7885e0300d4a9b1f06cbbd1dadd1876b37852eda1388b1b7afe90799ec894 not found: ID does not exist" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.262593 4687 scope.go:117] "RemoveContainer" containerID="8a1dfa5e933b1a8fb416fa8f82b3279a6d61c79acc1404a02f0b963a11cd365f" Feb 28 09:08:22 crc kubenswrapper[4687]: E0228 09:08:22.262809 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a1dfa5e933b1a8fb416fa8f82b3279a6d61c79acc1404a02f0b963a11cd365f\": container with ID starting with 8a1dfa5e933b1a8fb416fa8f82b3279a6d61c79acc1404a02f0b963a11cd365f not found: ID does not exist" containerID="8a1dfa5e933b1a8fb416fa8f82b3279a6d61c79acc1404a02f0b963a11cd365f" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.262827 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a1dfa5e933b1a8fb416fa8f82b3279a6d61c79acc1404a02f0b963a11cd365f"} err="failed to get container status 
\"8a1dfa5e933b1a8fb416fa8f82b3279a6d61c79acc1404a02f0b963a11cd365f\": rpc error: code = NotFound desc = could not find container \"8a1dfa5e933b1a8fb416fa8f82b3279a6d61c79acc1404a02f0b963a11cd365f\": container with ID starting with 8a1dfa5e933b1a8fb416fa8f82b3279a6d61c79acc1404a02f0b963a11cd365f not found: ID does not exist" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.264757 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19def7b9-fb5d-4e49-98db-784814aa9769-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19def7b9-fb5d-4e49-98db-784814aa9769" (UID: "19def7b9-fb5d-4e49-98db-784814aa9769"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.314102 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzd4r\" (UniqueName: \"kubernetes.io/projected/69eb70ff-d8c7-4dba-9f8e-1969b7947640-kube-api-access-jzd4r\") on node \"crc\" DevicePath \"\"" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.314130 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92df9\" (UniqueName: \"kubernetes.io/projected/19def7b9-fb5d-4e49-98db-784814aa9769-kube-api-access-92df9\") on node \"crc\" DevicePath \"\"" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.314142 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19def7b9-fb5d-4e49-98db-784814aa9769-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.314153 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a9c467e-d2ff-4322-bc25-5cfe38dff784-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.314165 4687 reconciler_common.go:293] "Volume detached for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19def7b9-fb5d-4e49-98db-784814aa9769-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.314174 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wgdh\" (UniqueName: \"kubernetes.io/projected/9a9c467e-d2ff-4322-bc25-5cfe38dff784-kube-api-access-2wgdh\") on node \"crc\" DevicePath \"\"" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.314182 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69eb70ff-d8c7-4dba-9f8e-1969b7947640-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.314190 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a9c467e-d2ff-4322-bc25-5cfe38dff784-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.328177 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69eb70ff-d8c7-4dba-9f8e-1969b7947640-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69eb70ff-d8c7-4dba-9f8e-1969b7947640" (UID: "69eb70ff-d8c7-4dba-9f8e-1969b7947640"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.351007 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qhc57"] Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.415002 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69eb70ff-d8c7-4dba-9f8e-1969b7947640-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.416171 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5xt25"] Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.421259 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5xt25"] Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.424451 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f7sr6"] Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.426681 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f7sr6"] Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.439586 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nkgl2"] Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.449419 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nkgl2"] Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.454406 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-npwxl"] Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.460534 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-npwxl"] Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.469364 4687 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-svtsw"] Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.475685 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-svtsw"] Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.662551 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19def7b9-fb5d-4e49-98db-784814aa9769" path="/var/lib/kubelet/pods/19def7b9-fb5d-4e49-98db-784814aa9769/volumes" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.663228 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36a32d28-84e1-4c44-b2e5-546c8a1c8853" path="/var/lib/kubelet/pods/36a32d28-84e1-4c44-b2e5-546c8a1c8853/volumes" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.663675 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="556a0190-2912-4b71-a5ae-70c614769f9d" path="/var/lib/kubelet/pods/556a0190-2912-4b71-a5ae-70c614769f9d/volumes" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.664286 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69eb70ff-d8c7-4dba-9f8e-1969b7947640" path="/var/lib/kubelet/pods/69eb70ff-d8c7-4dba-9f8e-1969b7947640/volumes" Feb 28 09:08:22 crc kubenswrapper[4687]: I0228 09:08:22.664869 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a9c467e-d2ff-4322-bc25-5cfe38dff784" path="/var/lib/kubelet/pods/9a9c467e-d2ff-4322-bc25-5cfe38dff784/volumes" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.100744 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qhc57" event={"ID":"e9586004-7da3-41d4-980d-825eafe37f51","Type":"ContainerStarted","Data":"061ce4101475afd838649756dd0dbf2a054f56e7def49bcaeede4f5b7903563f"} Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.101264 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-qhc57" event={"ID":"e9586004-7da3-41d4-980d-825eafe37f51","Type":"ContainerStarted","Data":"3e5fee0c3a59f69cf962586af6a0037dd703e715a9454073d4cdcd961ac50eb5"} Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.101340 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qhc57" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.105131 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qhc57" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.120357 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qhc57" podStartSLOduration=2.120337534 podStartE2EDuration="2.120337534s" podCreationTimestamp="2026-02-28 09:08:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:08:23.118567284 +0000 UTC m=+294.809136641" watchObservedRunningTime="2026-02-28 09:08:23.120337534 +0000 UTC m=+294.810906872" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.590111 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2ljbw"] Feb 28 09:08:23 crc kubenswrapper[4687]: E0228 09:08:23.590320 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a9c467e-d2ff-4322-bc25-5cfe38dff784" containerName="extract-utilities" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.590335 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a9c467e-d2ff-4322-bc25-5cfe38dff784" containerName="extract-utilities" Feb 28 09:08:23 crc kubenswrapper[4687]: E0228 09:08:23.590346 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556a0190-2912-4b71-a5ae-70c614769f9d" containerName="registry-server" Feb 28 09:08:23 crc 
kubenswrapper[4687]: I0228 09:08:23.590351 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="556a0190-2912-4b71-a5ae-70c614769f9d" containerName="registry-server" Feb 28 09:08:23 crc kubenswrapper[4687]: E0228 09:08:23.590360 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19def7b9-fb5d-4e49-98db-784814aa9769" containerName="extract-utilities" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.590367 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="19def7b9-fb5d-4e49-98db-784814aa9769" containerName="extract-utilities" Feb 28 09:08:23 crc kubenswrapper[4687]: E0228 09:08:23.590373 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19def7b9-fb5d-4e49-98db-784814aa9769" containerName="extract-content" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.590378 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="19def7b9-fb5d-4e49-98db-784814aa9769" containerName="extract-content" Feb 28 09:08:23 crc kubenswrapper[4687]: E0228 09:08:23.590391 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a9c467e-d2ff-4322-bc25-5cfe38dff784" containerName="extract-content" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.590396 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a9c467e-d2ff-4322-bc25-5cfe38dff784" containerName="extract-content" Feb 28 09:08:23 crc kubenswrapper[4687]: E0228 09:08:23.590404 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19def7b9-fb5d-4e49-98db-784814aa9769" containerName="registry-server" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.590409 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="19def7b9-fb5d-4e49-98db-784814aa9769" containerName="registry-server" Feb 28 09:08:23 crc kubenswrapper[4687]: E0228 09:08:23.590417 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556a0190-2912-4b71-a5ae-70c614769f9d" containerName="extract-utilities" Feb 28 09:08:23 crc 
kubenswrapper[4687]: I0228 09:08:23.590422 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="556a0190-2912-4b71-a5ae-70c614769f9d" containerName="extract-utilities" Feb 28 09:08:23 crc kubenswrapper[4687]: E0228 09:08:23.590430 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36a32d28-84e1-4c44-b2e5-546c8a1c8853" containerName="marketplace-operator" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.590437 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a32d28-84e1-4c44-b2e5-546c8a1c8853" containerName="marketplace-operator" Feb 28 09:08:23 crc kubenswrapper[4687]: E0228 09:08:23.590444 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69eb70ff-d8c7-4dba-9f8e-1969b7947640" containerName="extract-content" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.590449 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="69eb70ff-d8c7-4dba-9f8e-1969b7947640" containerName="extract-content" Feb 28 09:08:23 crc kubenswrapper[4687]: E0228 09:08:23.590457 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69eb70ff-d8c7-4dba-9f8e-1969b7947640" containerName="extract-utilities" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.590462 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="69eb70ff-d8c7-4dba-9f8e-1969b7947640" containerName="extract-utilities" Feb 28 09:08:23 crc kubenswrapper[4687]: E0228 09:08:23.590470 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556a0190-2912-4b71-a5ae-70c614769f9d" containerName="extract-content" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.590475 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="556a0190-2912-4b71-a5ae-70c614769f9d" containerName="extract-content" Feb 28 09:08:23 crc kubenswrapper[4687]: E0228 09:08:23.590481 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a9c467e-d2ff-4322-bc25-5cfe38dff784" containerName="registry-server" Feb 28 09:08:23 
crc kubenswrapper[4687]: I0228 09:08:23.590486 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a9c467e-d2ff-4322-bc25-5cfe38dff784" containerName="registry-server" Feb 28 09:08:23 crc kubenswrapper[4687]: E0228 09:08:23.590496 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69eb70ff-d8c7-4dba-9f8e-1969b7947640" containerName="registry-server" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.590500 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="69eb70ff-d8c7-4dba-9f8e-1969b7947640" containerName="registry-server" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.590581 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a9c467e-d2ff-4322-bc25-5cfe38dff784" containerName="registry-server" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.590591 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="556a0190-2912-4b71-a5ae-70c614769f9d" containerName="registry-server" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.590601 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="69eb70ff-d8c7-4dba-9f8e-1969b7947640" containerName="registry-server" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.590608 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="36a32d28-84e1-4c44-b2e5-546c8a1c8853" containerName="marketplace-operator" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.590618 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="19def7b9-fb5d-4e49-98db-784814aa9769" containerName="registry-server" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.591285 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2ljbw" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.593037 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.599720 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2ljbw"] Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.731881 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d6d3c50-a212-411b-9c51-4ea3b3fee060-utilities\") pod \"certified-operators-2ljbw\" (UID: \"5d6d3c50-a212-411b-9c51-4ea3b3fee060\") " pod="openshift-marketplace/certified-operators-2ljbw" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.731951 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d6d3c50-a212-411b-9c51-4ea3b3fee060-catalog-content\") pod \"certified-operators-2ljbw\" (UID: \"5d6d3c50-a212-411b-9c51-4ea3b3fee060\") " pod="openshift-marketplace/certified-operators-2ljbw" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.731990 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z549\" (UniqueName: \"kubernetes.io/projected/5d6d3c50-a212-411b-9c51-4ea3b3fee060-kube-api-access-2z549\") pod \"certified-operators-2ljbw\" (UID: \"5d6d3c50-a212-411b-9c51-4ea3b3fee060\") " pod="openshift-marketplace/certified-operators-2ljbw" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.833980 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d6d3c50-a212-411b-9c51-4ea3b3fee060-catalog-content\") pod \"certified-operators-2ljbw\" (UID: 
\"5d6d3c50-a212-411b-9c51-4ea3b3fee060\") " pod="openshift-marketplace/certified-operators-2ljbw" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.834057 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z549\" (UniqueName: \"kubernetes.io/projected/5d6d3c50-a212-411b-9c51-4ea3b3fee060-kube-api-access-2z549\") pod \"certified-operators-2ljbw\" (UID: \"5d6d3c50-a212-411b-9c51-4ea3b3fee060\") " pod="openshift-marketplace/certified-operators-2ljbw" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.834153 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d6d3c50-a212-411b-9c51-4ea3b3fee060-utilities\") pod \"certified-operators-2ljbw\" (UID: \"5d6d3c50-a212-411b-9c51-4ea3b3fee060\") " pod="openshift-marketplace/certified-operators-2ljbw" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.834667 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d6d3c50-a212-411b-9c51-4ea3b3fee060-catalog-content\") pod \"certified-operators-2ljbw\" (UID: \"5d6d3c50-a212-411b-9c51-4ea3b3fee060\") " pod="openshift-marketplace/certified-operators-2ljbw" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.834691 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d6d3c50-a212-411b-9c51-4ea3b3fee060-utilities\") pod \"certified-operators-2ljbw\" (UID: \"5d6d3c50-a212-411b-9c51-4ea3b3fee060\") " pod="openshift-marketplace/certified-operators-2ljbw" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.850690 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z549\" (UniqueName: \"kubernetes.io/projected/5d6d3c50-a212-411b-9c51-4ea3b3fee060-kube-api-access-2z549\") pod \"certified-operators-2ljbw\" (UID: 
\"5d6d3c50-a212-411b-9c51-4ea3b3fee060\") " pod="openshift-marketplace/certified-operators-2ljbw" Feb 28 09:08:23 crc kubenswrapper[4687]: I0228 09:08:23.914086 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2ljbw" Feb 28 09:08:24 crc kubenswrapper[4687]: I0228 09:08:24.186962 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dcllh"] Feb 28 09:08:24 crc kubenswrapper[4687]: I0228 09:08:24.188288 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dcllh" Feb 28 09:08:24 crc kubenswrapper[4687]: I0228 09:08:24.192792 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 28 09:08:24 crc kubenswrapper[4687]: I0228 09:08:24.199796 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dcllh"] Feb 28 09:08:24 crc kubenswrapper[4687]: I0228 09:08:24.302957 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2ljbw"] Feb 28 09:08:24 crc kubenswrapper[4687]: W0228 09:08:24.309631 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d6d3c50_a212_411b_9c51_4ea3b3fee060.slice/crio-04f7e15ab5ed3ba4c62c6259fa220b69dbc95ed04868922ea22e6a340cb27025 WatchSource:0}: Error finding container 04f7e15ab5ed3ba4c62c6259fa220b69dbc95ed04868922ea22e6a340cb27025: Status 404 returned error can't find the container with id 04f7e15ab5ed3ba4c62c6259fa220b69dbc95ed04868922ea22e6a340cb27025 Feb 28 09:08:24 crc kubenswrapper[4687]: I0228 09:08:24.342978 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c55d393-9095-4638-b5d0-d6dd60859eb8-utilities\") pod 
\"redhat-marketplace-dcllh\" (UID: \"1c55d393-9095-4638-b5d0-d6dd60859eb8\") " pod="openshift-marketplace/redhat-marketplace-dcllh" Feb 28 09:08:24 crc kubenswrapper[4687]: I0228 09:08:24.343750 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c55d393-9095-4638-b5d0-d6dd60859eb8-catalog-content\") pod \"redhat-marketplace-dcllh\" (UID: \"1c55d393-9095-4638-b5d0-d6dd60859eb8\") " pod="openshift-marketplace/redhat-marketplace-dcllh" Feb 28 09:08:24 crc kubenswrapper[4687]: I0228 09:08:24.343792 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68ws9\" (UniqueName: \"kubernetes.io/projected/1c55d393-9095-4638-b5d0-d6dd60859eb8-kube-api-access-68ws9\") pod \"redhat-marketplace-dcllh\" (UID: \"1c55d393-9095-4638-b5d0-d6dd60859eb8\") " pod="openshift-marketplace/redhat-marketplace-dcllh" Feb 28 09:08:24 crc kubenswrapper[4687]: I0228 09:08:24.444379 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68ws9\" (UniqueName: \"kubernetes.io/projected/1c55d393-9095-4638-b5d0-d6dd60859eb8-kube-api-access-68ws9\") pod \"redhat-marketplace-dcllh\" (UID: \"1c55d393-9095-4638-b5d0-d6dd60859eb8\") " pod="openshift-marketplace/redhat-marketplace-dcllh" Feb 28 09:08:24 crc kubenswrapper[4687]: I0228 09:08:24.444435 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c55d393-9095-4638-b5d0-d6dd60859eb8-utilities\") pod \"redhat-marketplace-dcllh\" (UID: \"1c55d393-9095-4638-b5d0-d6dd60859eb8\") " pod="openshift-marketplace/redhat-marketplace-dcllh" Feb 28 09:08:24 crc kubenswrapper[4687]: I0228 09:08:24.444509 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1c55d393-9095-4638-b5d0-d6dd60859eb8-catalog-content\") pod \"redhat-marketplace-dcllh\" (UID: \"1c55d393-9095-4638-b5d0-d6dd60859eb8\") " pod="openshift-marketplace/redhat-marketplace-dcllh" Feb 28 09:08:24 crc kubenswrapper[4687]: I0228 09:08:24.444937 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c55d393-9095-4638-b5d0-d6dd60859eb8-catalog-content\") pod \"redhat-marketplace-dcllh\" (UID: \"1c55d393-9095-4638-b5d0-d6dd60859eb8\") " pod="openshift-marketplace/redhat-marketplace-dcllh" Feb 28 09:08:24 crc kubenswrapper[4687]: I0228 09:08:24.445032 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c55d393-9095-4638-b5d0-d6dd60859eb8-utilities\") pod \"redhat-marketplace-dcllh\" (UID: \"1c55d393-9095-4638-b5d0-d6dd60859eb8\") " pod="openshift-marketplace/redhat-marketplace-dcllh" Feb 28 09:08:24 crc kubenswrapper[4687]: I0228 09:08:24.460938 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68ws9\" (UniqueName: \"kubernetes.io/projected/1c55d393-9095-4638-b5d0-d6dd60859eb8-kube-api-access-68ws9\") pod \"redhat-marketplace-dcllh\" (UID: \"1c55d393-9095-4638-b5d0-d6dd60859eb8\") " pod="openshift-marketplace/redhat-marketplace-dcllh" Feb 28 09:08:24 crc kubenswrapper[4687]: I0228 09:08:24.501682 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dcllh" Feb 28 09:08:24 crc kubenswrapper[4687]: I0228 09:08:24.886706 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dcllh"] Feb 28 09:08:24 crc kubenswrapper[4687]: W0228 09:08:24.893089 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c55d393_9095_4638_b5d0_d6dd60859eb8.slice/crio-7abb341432d7b37f4ee74c35806663ca56b7eb34376c282ea87043de0d05e0f8 WatchSource:0}: Error finding container 7abb341432d7b37f4ee74c35806663ca56b7eb34376c282ea87043de0d05e0f8: Status 404 returned error can't find the container with id 7abb341432d7b37f4ee74c35806663ca56b7eb34376c282ea87043de0d05e0f8 Feb 28 09:08:25 crc kubenswrapper[4687]: I0228 09:08:25.116721 4687 generic.go:334] "Generic (PLEG): container finished" podID="5d6d3c50-a212-411b-9c51-4ea3b3fee060" containerID="cb0efef207563af717e701ce1f2c2dccafc4f733dffd789c84aaf60e072af0ea" exitCode=0 Feb 28 09:08:25 crc kubenswrapper[4687]: I0228 09:08:25.116816 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ljbw" event={"ID":"5d6d3c50-a212-411b-9c51-4ea3b3fee060","Type":"ContainerDied","Data":"cb0efef207563af717e701ce1f2c2dccafc4f733dffd789c84aaf60e072af0ea"} Feb 28 09:08:25 crc kubenswrapper[4687]: I0228 09:08:25.116859 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ljbw" event={"ID":"5d6d3c50-a212-411b-9c51-4ea3b3fee060","Type":"ContainerStarted","Data":"04f7e15ab5ed3ba4c62c6259fa220b69dbc95ed04868922ea22e6a340cb27025"} Feb 28 09:08:25 crc kubenswrapper[4687]: I0228 09:08:25.118506 4687 generic.go:334] "Generic (PLEG): container finished" podID="1c55d393-9095-4638-b5d0-d6dd60859eb8" containerID="ad9372ddf201d5acb065d4799902d263325a287812360e8ff1611d6c2bd5a649" exitCode=0 Feb 28 09:08:25 crc kubenswrapper[4687]: I0228 
09:08:25.118592 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dcllh" event={"ID":"1c55d393-9095-4638-b5d0-d6dd60859eb8","Type":"ContainerDied","Data":"ad9372ddf201d5acb065d4799902d263325a287812360e8ff1611d6c2bd5a649"} Feb 28 09:08:25 crc kubenswrapper[4687]: I0228 09:08:25.118684 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dcllh" event={"ID":"1c55d393-9095-4638-b5d0-d6dd60859eb8","Type":"ContainerStarted","Data":"7abb341432d7b37f4ee74c35806663ca56b7eb34376c282ea87043de0d05e0f8"} Feb 28 09:08:25 crc kubenswrapper[4687]: I0228 09:08:25.984312 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xssb6"] Feb 28 09:08:25 crc kubenswrapper[4687]: I0228 09:08:25.985739 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xssb6" Feb 28 09:08:25 crc kubenswrapper[4687]: I0228 09:08:25.987518 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 28 09:08:25 crc kubenswrapper[4687]: I0228 09:08:25.993748 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xssb6"] Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.064577 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdgmj\" (UniqueName: \"kubernetes.io/projected/12de48e8-809e-43e9-827f-28ce52d796e8-kube-api-access-gdgmj\") pod \"redhat-operators-xssb6\" (UID: \"12de48e8-809e-43e9-827f-28ce52d796e8\") " pod="openshift-marketplace/redhat-operators-xssb6" Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.064627 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/12de48e8-809e-43e9-827f-28ce52d796e8-catalog-content\") pod \"redhat-operators-xssb6\" (UID: \"12de48e8-809e-43e9-827f-28ce52d796e8\") " pod="openshift-marketplace/redhat-operators-xssb6" Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.064660 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12de48e8-809e-43e9-827f-28ce52d796e8-utilities\") pod \"redhat-operators-xssb6\" (UID: \"12de48e8-809e-43e9-827f-28ce52d796e8\") " pod="openshift-marketplace/redhat-operators-xssb6" Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.124122 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ljbw" event={"ID":"5d6d3c50-a212-411b-9c51-4ea3b3fee060","Type":"ContainerStarted","Data":"e04926f3bfa682e9f78ff3f7434d36113b508e78e76c5c3076bd6d5cc5e05542"} Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.126377 4687 generic.go:334] "Generic (PLEG): container finished" podID="1c55d393-9095-4638-b5d0-d6dd60859eb8" containerID="e2fc415d5f629168aa7c40fa0371e420a365418bd6ff30830adee17117ed02dd" exitCode=0 Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.126412 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dcllh" event={"ID":"1c55d393-9095-4638-b5d0-d6dd60859eb8","Type":"ContainerDied","Data":"e2fc415d5f629168aa7c40fa0371e420a365418bd6ff30830adee17117ed02dd"} Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.165727 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdgmj\" (UniqueName: \"kubernetes.io/projected/12de48e8-809e-43e9-827f-28ce52d796e8-kube-api-access-gdgmj\") pod \"redhat-operators-xssb6\" (UID: \"12de48e8-809e-43e9-827f-28ce52d796e8\") " pod="openshift-marketplace/redhat-operators-xssb6" Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.165765 4687 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12de48e8-809e-43e9-827f-28ce52d796e8-catalog-content\") pod \"redhat-operators-xssb6\" (UID: \"12de48e8-809e-43e9-827f-28ce52d796e8\") " pod="openshift-marketplace/redhat-operators-xssb6" Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.165798 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12de48e8-809e-43e9-827f-28ce52d796e8-utilities\") pod \"redhat-operators-xssb6\" (UID: \"12de48e8-809e-43e9-827f-28ce52d796e8\") " pod="openshift-marketplace/redhat-operators-xssb6" Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.166266 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12de48e8-809e-43e9-827f-28ce52d796e8-utilities\") pod \"redhat-operators-xssb6\" (UID: \"12de48e8-809e-43e9-827f-28ce52d796e8\") " pod="openshift-marketplace/redhat-operators-xssb6" Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.166406 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12de48e8-809e-43e9-827f-28ce52d796e8-catalog-content\") pod \"redhat-operators-xssb6\" (UID: \"12de48e8-809e-43e9-827f-28ce52d796e8\") " pod="openshift-marketplace/redhat-operators-xssb6" Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.181550 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdgmj\" (UniqueName: \"kubernetes.io/projected/12de48e8-809e-43e9-827f-28ce52d796e8-kube-api-access-gdgmj\") pod \"redhat-operators-xssb6\" (UID: \"12de48e8-809e-43e9-827f-28ce52d796e8\") " pod="openshift-marketplace/redhat-operators-xssb6" Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.301339 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xssb6" Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.456117 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xssb6"] Feb 28 09:08:26 crc kubenswrapper[4687]: W0228 09:08:26.461359 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12de48e8_809e_43e9_827f_28ce52d796e8.slice/crio-0aad2cc267ad1507ec06bf083990380b52ee97d615d81c1e57dacefcfc4594eb WatchSource:0}: Error finding container 0aad2cc267ad1507ec06bf083990380b52ee97d615d81c1e57dacefcfc4594eb: Status 404 returned error can't find the container with id 0aad2cc267ad1507ec06bf083990380b52ee97d615d81c1e57dacefcfc4594eb Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.598843 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-99h7q"] Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.599939 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-99h7q" Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.601279 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.602896 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-99h7q"] Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.780769 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43862b0c-fb60-45f2-b4bd-0e09864292a9-catalog-content\") pod \"community-operators-99h7q\" (UID: \"43862b0c-fb60-45f2-b4bd-0e09864292a9\") " pod="openshift-marketplace/community-operators-99h7q" Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.780847 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cn7m\" (UniqueName: \"kubernetes.io/projected/43862b0c-fb60-45f2-b4bd-0e09864292a9-kube-api-access-7cn7m\") pod \"community-operators-99h7q\" (UID: \"43862b0c-fb60-45f2-b4bd-0e09864292a9\") " pod="openshift-marketplace/community-operators-99h7q" Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.780873 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43862b0c-fb60-45f2-b4bd-0e09864292a9-utilities\") pod \"community-operators-99h7q\" (UID: \"43862b0c-fb60-45f2-b4bd-0e09864292a9\") " pod="openshift-marketplace/community-operators-99h7q" Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.882434 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43862b0c-fb60-45f2-b4bd-0e09864292a9-catalog-content\") pod \"community-operators-99h7q\" (UID: 
\"43862b0c-fb60-45f2-b4bd-0e09864292a9\") " pod="openshift-marketplace/community-operators-99h7q" Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.882797 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cn7m\" (UniqueName: \"kubernetes.io/projected/43862b0c-fb60-45f2-b4bd-0e09864292a9-kube-api-access-7cn7m\") pod \"community-operators-99h7q\" (UID: \"43862b0c-fb60-45f2-b4bd-0e09864292a9\") " pod="openshift-marketplace/community-operators-99h7q" Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.882839 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43862b0c-fb60-45f2-b4bd-0e09864292a9-utilities\") pod \"community-operators-99h7q\" (UID: \"43862b0c-fb60-45f2-b4bd-0e09864292a9\") " pod="openshift-marketplace/community-operators-99h7q" Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.882856 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43862b0c-fb60-45f2-b4bd-0e09864292a9-catalog-content\") pod \"community-operators-99h7q\" (UID: \"43862b0c-fb60-45f2-b4bd-0e09864292a9\") " pod="openshift-marketplace/community-operators-99h7q" Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.883204 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43862b0c-fb60-45f2-b4bd-0e09864292a9-utilities\") pod \"community-operators-99h7q\" (UID: \"43862b0c-fb60-45f2-b4bd-0e09864292a9\") " pod="openshift-marketplace/community-operators-99h7q" Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.901922 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cn7m\" (UniqueName: \"kubernetes.io/projected/43862b0c-fb60-45f2-b4bd-0e09864292a9-kube-api-access-7cn7m\") pod \"community-operators-99h7q\" (UID: 
\"43862b0c-fb60-45f2-b4bd-0e09864292a9\") " pod="openshift-marketplace/community-operators-99h7q" Feb 28 09:08:26 crc kubenswrapper[4687]: I0228 09:08:26.966184 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-99h7q" Feb 28 09:08:27 crc kubenswrapper[4687]: I0228 09:08:27.135617 4687 generic.go:334] "Generic (PLEG): container finished" podID="5d6d3c50-a212-411b-9c51-4ea3b3fee060" containerID="e04926f3bfa682e9f78ff3f7434d36113b508e78e76c5c3076bd6d5cc5e05542" exitCode=0 Feb 28 09:08:27 crc kubenswrapper[4687]: I0228 09:08:27.135698 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ljbw" event={"ID":"5d6d3c50-a212-411b-9c51-4ea3b3fee060","Type":"ContainerDied","Data":"e04926f3bfa682e9f78ff3f7434d36113b508e78e76c5c3076bd6d5cc5e05542"} Feb 28 09:08:27 crc kubenswrapper[4687]: I0228 09:08:27.140734 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dcllh" event={"ID":"1c55d393-9095-4638-b5d0-d6dd60859eb8","Type":"ContainerStarted","Data":"8386b04c87319fb56ac24e09863bc9cd2206e56e94af1172f2ec77065be9f59a"} Feb 28 09:08:27 crc kubenswrapper[4687]: I0228 09:08:27.143062 4687 generic.go:334] "Generic (PLEG): container finished" podID="12de48e8-809e-43e9-827f-28ce52d796e8" containerID="65638452745c29a860d9d8e724656dadcb8b19ca9400f330198a5692a9ef16d3" exitCode=0 Feb 28 09:08:27 crc kubenswrapper[4687]: I0228 09:08:27.143104 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xssb6" event={"ID":"12de48e8-809e-43e9-827f-28ce52d796e8","Type":"ContainerDied","Data":"65638452745c29a860d9d8e724656dadcb8b19ca9400f330198a5692a9ef16d3"} Feb 28 09:08:27 crc kubenswrapper[4687]: I0228 09:08:27.143124 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xssb6" 
event={"ID":"12de48e8-809e-43e9-827f-28ce52d796e8","Type":"ContainerStarted","Data":"0aad2cc267ad1507ec06bf083990380b52ee97d615d81c1e57dacefcfc4594eb"} Feb 28 09:08:27 crc kubenswrapper[4687]: I0228 09:08:27.183232 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dcllh" podStartSLOduration=1.642474343 podStartE2EDuration="3.183211075s" podCreationTimestamp="2026-02-28 09:08:24 +0000 UTC" firstStartedPulling="2026-02-28 09:08:25.119723189 +0000 UTC m=+296.810292526" lastFinishedPulling="2026-02-28 09:08:26.660459922 +0000 UTC m=+298.351029258" observedRunningTime="2026-02-28 09:08:27.181481021 +0000 UTC m=+298.872050359" watchObservedRunningTime="2026-02-28 09:08:27.183211075 +0000 UTC m=+298.873780413" Feb 28 09:08:27 crc kubenswrapper[4687]: I0228 09:08:27.359815 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-99h7q"] Feb 28 09:08:27 crc kubenswrapper[4687]: W0228 09:08:27.363859 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43862b0c_fb60_45f2_b4bd_0e09864292a9.slice/crio-18875b411c42a1574a626e8ebae7b09cc34ff8f01994f2e3d1fc80b422f4a504 WatchSource:0}: Error finding container 18875b411c42a1574a626e8ebae7b09cc34ff8f01994f2e3d1fc80b422f4a504: Status 404 returned error can't find the container with id 18875b411c42a1574a626e8ebae7b09cc34ff8f01994f2e3d1fc80b422f4a504 Feb 28 09:08:28 crc kubenswrapper[4687]: I0228 09:08:28.151176 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ljbw" event={"ID":"5d6d3c50-a212-411b-9c51-4ea3b3fee060","Type":"ContainerStarted","Data":"72e3e974017209cc8eb8ab74874477773d2b9c7f484f7ba81022065cefd98c8d"} Feb 28 09:08:28 crc kubenswrapper[4687]: I0228 09:08:28.153504 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xssb6" 
event={"ID":"12de48e8-809e-43e9-827f-28ce52d796e8","Type":"ContainerStarted","Data":"f33ecfcf936daaf79537aaf32b63a77a19b1024e386c0f23e85a4ba39ab6fc5c"} Feb 28 09:08:28 crc kubenswrapper[4687]: I0228 09:08:28.155974 4687 generic.go:334] "Generic (PLEG): container finished" podID="43862b0c-fb60-45f2-b4bd-0e09864292a9" containerID="623dd2ae9d0cb573e15e4ecfcc7b256357f502dc95691e7b1471fb34e28cf66a" exitCode=0 Feb 28 09:08:28 crc kubenswrapper[4687]: I0228 09:08:28.156799 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99h7q" event={"ID":"43862b0c-fb60-45f2-b4bd-0e09864292a9","Type":"ContainerDied","Data":"623dd2ae9d0cb573e15e4ecfcc7b256357f502dc95691e7b1471fb34e28cf66a"} Feb 28 09:08:28 crc kubenswrapper[4687]: I0228 09:08:28.156824 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99h7q" event={"ID":"43862b0c-fb60-45f2-b4bd-0e09864292a9","Type":"ContainerStarted","Data":"18875b411c42a1574a626e8ebae7b09cc34ff8f01994f2e3d1fc80b422f4a504"} Feb 28 09:08:28 crc kubenswrapper[4687]: I0228 09:08:28.171776 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2ljbw" podStartSLOduration=2.601361462 podStartE2EDuration="5.171764836s" podCreationTimestamp="2026-02-28 09:08:23 +0000 UTC" firstStartedPulling="2026-02-28 09:08:25.118191288 +0000 UTC m=+296.808760625" lastFinishedPulling="2026-02-28 09:08:27.688594662 +0000 UTC m=+299.379163999" observedRunningTime="2026-02-28 09:08:28.169181937 +0000 UTC m=+299.859751275" watchObservedRunningTime="2026-02-28 09:08:28.171764836 +0000 UTC m=+299.862334174" Feb 28 09:08:29 crc kubenswrapper[4687]: I0228 09:08:29.163103 4687 generic.go:334] "Generic (PLEG): container finished" podID="43862b0c-fb60-45f2-b4bd-0e09864292a9" containerID="984d63e6e75bbcaba310cdae449f8d261e6066c44b0cef3a5676abd6c7f0048e" exitCode=0 Feb 28 09:08:29 crc kubenswrapper[4687]: I0228 
09:08:29.163191 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99h7q" event={"ID":"43862b0c-fb60-45f2-b4bd-0e09864292a9","Type":"ContainerDied","Data":"984d63e6e75bbcaba310cdae449f8d261e6066c44b0cef3a5676abd6c7f0048e"} Feb 28 09:08:29 crc kubenswrapper[4687]: I0228 09:08:29.165287 4687 generic.go:334] "Generic (PLEG): container finished" podID="12de48e8-809e-43e9-827f-28ce52d796e8" containerID="f33ecfcf936daaf79537aaf32b63a77a19b1024e386c0f23e85a4ba39ab6fc5c" exitCode=0 Feb 28 09:08:29 crc kubenswrapper[4687]: I0228 09:08:29.165952 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xssb6" event={"ID":"12de48e8-809e-43e9-827f-28ce52d796e8","Type":"ContainerDied","Data":"f33ecfcf936daaf79537aaf32b63a77a19b1024e386c0f23e85a4ba39ab6fc5c"} Feb 28 09:08:30 crc kubenswrapper[4687]: I0228 09:08:30.176574 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xssb6" event={"ID":"12de48e8-809e-43e9-827f-28ce52d796e8","Type":"ContainerStarted","Data":"57ed74d5c5fa6c2d9612c1db356f893173065fb654ce56c8f4a2f3d869cbf797"} Feb 28 09:08:30 crc kubenswrapper[4687]: I0228 09:08:30.178884 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99h7q" event={"ID":"43862b0c-fb60-45f2-b4bd-0e09864292a9","Type":"ContainerStarted","Data":"599b5fb313cd7871cfa3af27d24ade004393bdbd3779de8211f92d9ac4eea872"} Feb 28 09:08:30 crc kubenswrapper[4687]: I0228 09:08:30.197325 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xssb6" podStartSLOduration=2.649365694 podStartE2EDuration="5.197298133s" podCreationTimestamp="2026-02-28 09:08:25 +0000 UTC" firstStartedPulling="2026-02-28 09:08:27.144644264 +0000 UTC m=+298.835213601" lastFinishedPulling="2026-02-28 09:08:29.692576702 +0000 UTC m=+301.383146040" observedRunningTime="2026-02-28 
09:08:30.194949846 +0000 UTC m=+301.885519183" watchObservedRunningTime="2026-02-28 09:08:30.197298133 +0000 UTC m=+301.887867470" Feb 28 09:08:30 crc kubenswrapper[4687]: I0228 09:08:30.220922 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-99h7q" podStartSLOduration=2.650289603 podStartE2EDuration="4.220889827s" podCreationTimestamp="2026-02-28 09:08:26 +0000 UTC" firstStartedPulling="2026-02-28 09:08:28.157559648 +0000 UTC m=+299.848128984" lastFinishedPulling="2026-02-28 09:08:29.728159871 +0000 UTC m=+301.418729208" observedRunningTime="2026-02-28 09:08:30.216390153 +0000 UTC m=+301.906959490" watchObservedRunningTime="2026-02-28 09:08:30.220889827 +0000 UTC m=+301.911459164" Feb 28 09:08:33 crc kubenswrapper[4687]: I0228 09:08:33.914619 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2ljbw" Feb 28 09:08:33 crc kubenswrapper[4687]: I0228 09:08:33.915405 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2ljbw" Feb 28 09:08:33 crc kubenswrapper[4687]: I0228 09:08:33.954563 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2ljbw" Feb 28 09:08:34 crc kubenswrapper[4687]: I0228 09:08:34.243171 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2ljbw" Feb 28 09:08:34 crc kubenswrapper[4687]: I0228 09:08:34.502876 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dcllh" Feb 28 09:08:34 crc kubenswrapper[4687]: I0228 09:08:34.502956 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dcllh" Feb 28 09:08:34 crc kubenswrapper[4687]: I0228 09:08:34.535340 4687 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dcllh" Feb 28 09:08:35 crc kubenswrapper[4687]: I0228 09:08:35.238888 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dcllh" Feb 28 09:08:36 crc kubenswrapper[4687]: I0228 09:08:36.302495 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xssb6" Feb 28 09:08:36 crc kubenswrapper[4687]: I0228 09:08:36.302564 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xssb6" Feb 28 09:08:36 crc kubenswrapper[4687]: I0228 09:08:36.335064 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xssb6" Feb 28 09:08:36 crc kubenswrapper[4687]: I0228 09:08:36.966607 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-99h7q" Feb 28 09:08:36 crc kubenswrapper[4687]: I0228 09:08:36.966934 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-99h7q" Feb 28 09:08:36 crc kubenswrapper[4687]: I0228 09:08:36.997275 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-99h7q" Feb 28 09:08:37 crc kubenswrapper[4687]: I0228 09:08:37.243192 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-99h7q" Feb 28 09:08:37 crc kubenswrapper[4687]: I0228 09:08:37.244742 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xssb6" Feb 28 09:09:55 crc kubenswrapper[4687]: I0228 09:09:55.002433 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:09:55 crc kubenswrapper[4687]: I0228 09:09:55.003015 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:10:00 crc kubenswrapper[4687]: I0228 09:10:00.132240 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537830-99xt9"] Feb 28 09:10:00 crc kubenswrapper[4687]: I0228 09:10:00.133417 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537830-99xt9" Feb 28 09:10:00 crc kubenswrapper[4687]: I0228 09:10:00.135397 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:10:00 crc kubenswrapper[4687]: I0228 09:10:00.136147 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:10:00 crc kubenswrapper[4687]: I0228 09:10:00.137885 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537830-99xt9"] Feb 28 09:10:00 crc kubenswrapper[4687]: I0228 09:10:00.139010 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:10:00 crc kubenswrapper[4687]: I0228 09:10:00.203121 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlfns\" (UniqueName: \"kubernetes.io/projected/aca959e0-750a-4677-ab35-59ff3b0c6d5b-kube-api-access-xlfns\") pod \"auto-csr-approver-29537830-99xt9\" (UID: 
\"aca959e0-750a-4677-ab35-59ff3b0c6d5b\") " pod="openshift-infra/auto-csr-approver-29537830-99xt9" Feb 28 09:10:00 crc kubenswrapper[4687]: I0228 09:10:00.304347 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlfns\" (UniqueName: \"kubernetes.io/projected/aca959e0-750a-4677-ab35-59ff3b0c6d5b-kube-api-access-xlfns\") pod \"auto-csr-approver-29537830-99xt9\" (UID: \"aca959e0-750a-4677-ab35-59ff3b0c6d5b\") " pod="openshift-infra/auto-csr-approver-29537830-99xt9" Feb 28 09:10:00 crc kubenswrapper[4687]: I0228 09:10:00.324274 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlfns\" (UniqueName: \"kubernetes.io/projected/aca959e0-750a-4677-ab35-59ff3b0c6d5b-kube-api-access-xlfns\") pod \"auto-csr-approver-29537830-99xt9\" (UID: \"aca959e0-750a-4677-ab35-59ff3b0c6d5b\") " pod="openshift-infra/auto-csr-approver-29537830-99xt9" Feb 28 09:10:00 crc kubenswrapper[4687]: I0228 09:10:00.447031 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537830-99xt9" Feb 28 09:10:00 crc kubenswrapper[4687]: I0228 09:10:00.796150 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537830-99xt9"] Feb 28 09:10:01 crc kubenswrapper[4687]: I0228 09:10:01.617074 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537830-99xt9" event={"ID":"aca959e0-750a-4677-ab35-59ff3b0c6d5b","Type":"ContainerStarted","Data":"e51d7cd20a18473b549e84e4142eb4690cd19eb3886f32684cae9d94135e5911"} Feb 28 09:10:02 crc kubenswrapper[4687]: I0228 09:10:02.623605 4687 generic.go:334] "Generic (PLEG): container finished" podID="aca959e0-750a-4677-ab35-59ff3b0c6d5b" containerID="b6499e19e98bfef81b679942402139ecdcad9fa0f60e1cabdc12729e3c1393c6" exitCode=0 Feb 28 09:10:02 crc kubenswrapper[4687]: I0228 09:10:02.623691 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537830-99xt9" event={"ID":"aca959e0-750a-4677-ab35-59ff3b0c6d5b","Type":"ContainerDied","Data":"b6499e19e98bfef81b679942402139ecdcad9fa0f60e1cabdc12729e3c1393c6"} Feb 28 09:10:03 crc kubenswrapper[4687]: I0228 09:10:03.805679 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537830-99xt9" Feb 28 09:10:03 crc kubenswrapper[4687]: I0228 09:10:03.852893 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlfns\" (UniqueName: \"kubernetes.io/projected/aca959e0-750a-4677-ab35-59ff3b0c6d5b-kube-api-access-xlfns\") pod \"aca959e0-750a-4677-ab35-59ff3b0c6d5b\" (UID: \"aca959e0-750a-4677-ab35-59ff3b0c6d5b\") " Feb 28 09:10:03 crc kubenswrapper[4687]: I0228 09:10:03.858497 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aca959e0-750a-4677-ab35-59ff3b0c6d5b-kube-api-access-xlfns" (OuterVolumeSpecName: "kube-api-access-xlfns") pod "aca959e0-750a-4677-ab35-59ff3b0c6d5b" (UID: "aca959e0-750a-4677-ab35-59ff3b0c6d5b"). InnerVolumeSpecName "kube-api-access-xlfns". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:10:03 crc kubenswrapper[4687]: I0228 09:10:03.954473 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlfns\" (UniqueName: \"kubernetes.io/projected/aca959e0-750a-4677-ab35-59ff3b0c6d5b-kube-api-access-xlfns\") on node \"crc\" DevicePath \"\"" Feb 28 09:10:04 crc kubenswrapper[4687]: I0228 09:10:04.635109 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537830-99xt9" event={"ID":"aca959e0-750a-4677-ab35-59ff3b0c6d5b","Type":"ContainerDied","Data":"e51d7cd20a18473b549e84e4142eb4690cd19eb3886f32684cae9d94135e5911"} Feb 28 09:10:04 crc kubenswrapper[4687]: I0228 09:10:04.635169 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e51d7cd20a18473b549e84e4142eb4690cd19eb3886f32684cae9d94135e5911" Feb 28 09:10:04 crc kubenswrapper[4687]: I0228 09:10:04.635480 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537830-99xt9" Feb 28 09:10:25 crc kubenswrapper[4687]: I0228 09:10:25.002430 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:10:25 crc kubenswrapper[4687]: I0228 09:10:25.003074 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:10:55 crc kubenswrapper[4687]: I0228 09:10:55.002504 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:10:55 crc kubenswrapper[4687]: I0228 09:10:55.003352 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:10:55 crc kubenswrapper[4687]: I0228 09:10:55.003433 4687 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" Feb 28 09:10:55 crc kubenswrapper[4687]: I0228 09:10:55.004268 4687 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"fad4ec2f45b132fa1fcbba9b5a4c5891531193748a3177bf121c290113487ba4"} pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 28 09:10:55 crc kubenswrapper[4687]: I0228 09:10:55.004345 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" containerID="cri-o://fad4ec2f45b132fa1fcbba9b5a4c5891531193748a3177bf121c290113487ba4" gracePeriod=600 Feb 28 09:10:55 crc kubenswrapper[4687]: I0228 09:10:55.892368 4687 generic.go:334] "Generic (PLEG): container finished" podID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerID="fad4ec2f45b132fa1fcbba9b5a4c5891531193748a3177bf121c290113487ba4" exitCode=0 Feb 28 09:10:55 crc kubenswrapper[4687]: I0228 09:10:55.892442 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerDied","Data":"fad4ec2f45b132fa1fcbba9b5a4c5891531193748a3177bf121c290113487ba4"} Feb 28 09:10:55 crc kubenswrapper[4687]: I0228 09:10:55.892724 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerStarted","Data":"bcbde49ebdbfb08d03f55668dbe45e77e9c15c2d2f6e5cdcc206fabca01051bf"} Feb 28 09:10:55 crc kubenswrapper[4687]: I0228 09:10:55.892762 4687 scope.go:117] "RemoveContainer" containerID="a4fa09ae345698d6959b87a651d6646b2e144c55db675e36a768b83892b2c64d" Feb 28 09:12:00 crc kubenswrapper[4687]: I0228 09:12:00.125193 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537832-gnr45"] Feb 28 09:12:00 crc kubenswrapper[4687]: E0228 
09:12:00.125905 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aca959e0-750a-4677-ab35-59ff3b0c6d5b" containerName="oc" Feb 28 09:12:00 crc kubenswrapper[4687]: I0228 09:12:00.125918 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="aca959e0-750a-4677-ab35-59ff3b0c6d5b" containerName="oc" Feb 28 09:12:00 crc kubenswrapper[4687]: I0228 09:12:00.126015 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="aca959e0-750a-4677-ab35-59ff3b0c6d5b" containerName="oc" Feb 28 09:12:00 crc kubenswrapper[4687]: I0228 09:12:00.126417 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537832-gnr45" Feb 28 09:12:00 crc kubenswrapper[4687]: I0228 09:12:00.127682 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:12:00 crc kubenswrapper[4687]: I0228 09:12:00.128084 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:12:00 crc kubenswrapper[4687]: I0228 09:12:00.128228 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:12:00 crc kubenswrapper[4687]: I0228 09:12:00.131058 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgcpq\" (UniqueName: \"kubernetes.io/projected/fd97b565-7b98-4a48-8ff8-ed015b66da45-kube-api-access-fgcpq\") pod \"auto-csr-approver-29537832-gnr45\" (UID: \"fd97b565-7b98-4a48-8ff8-ed015b66da45\") " pod="openshift-infra/auto-csr-approver-29537832-gnr45" Feb 28 09:12:00 crc kubenswrapper[4687]: I0228 09:12:00.144402 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537832-gnr45"] Feb 28 09:12:00 crc kubenswrapper[4687]: I0228 09:12:00.231835 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fgcpq\" (UniqueName: \"kubernetes.io/projected/fd97b565-7b98-4a48-8ff8-ed015b66da45-kube-api-access-fgcpq\") pod \"auto-csr-approver-29537832-gnr45\" (UID: \"fd97b565-7b98-4a48-8ff8-ed015b66da45\") " pod="openshift-infra/auto-csr-approver-29537832-gnr45" Feb 28 09:12:00 crc kubenswrapper[4687]: I0228 09:12:00.250643 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgcpq\" (UniqueName: \"kubernetes.io/projected/fd97b565-7b98-4a48-8ff8-ed015b66da45-kube-api-access-fgcpq\") pod \"auto-csr-approver-29537832-gnr45\" (UID: \"fd97b565-7b98-4a48-8ff8-ed015b66da45\") " pod="openshift-infra/auto-csr-approver-29537832-gnr45" Feb 28 09:12:00 crc kubenswrapper[4687]: I0228 09:12:00.452053 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537832-gnr45" Feb 28 09:12:00 crc kubenswrapper[4687]: I0228 09:12:00.592601 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537832-gnr45"] Feb 28 09:12:00 crc kubenswrapper[4687]: I0228 09:12:00.600462 4687 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 28 09:12:01 crc kubenswrapper[4687]: I0228 09:12:01.218824 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537832-gnr45" event={"ID":"fd97b565-7b98-4a48-8ff8-ed015b66da45","Type":"ContainerStarted","Data":"53fab24c2c419507361c608eca701c0575258f1f965709796b93ff30245389dd"} Feb 28 09:12:02 crc kubenswrapper[4687]: I0228 09:12:02.224460 4687 generic.go:334] "Generic (PLEG): container finished" podID="fd97b565-7b98-4a48-8ff8-ed015b66da45" containerID="88cf144858170a73b2f4fcb48fce7f766c95fd90081524d373422e858474102b" exitCode=0 Feb 28 09:12:02 crc kubenswrapper[4687]: I0228 09:12:02.224514 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537832-gnr45" 
event={"ID":"fd97b565-7b98-4a48-8ff8-ed015b66da45","Type":"ContainerDied","Data":"88cf144858170a73b2f4fcb48fce7f766c95fd90081524d373422e858474102b"} Feb 28 09:12:03 crc kubenswrapper[4687]: I0228 09:12:03.397207 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537832-gnr45" Feb 28 09:12:03 crc kubenswrapper[4687]: I0228 09:12:03.472933 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgcpq\" (UniqueName: \"kubernetes.io/projected/fd97b565-7b98-4a48-8ff8-ed015b66da45-kube-api-access-fgcpq\") pod \"fd97b565-7b98-4a48-8ff8-ed015b66da45\" (UID: \"fd97b565-7b98-4a48-8ff8-ed015b66da45\") " Feb 28 09:12:03 crc kubenswrapper[4687]: I0228 09:12:03.477872 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd97b565-7b98-4a48-8ff8-ed015b66da45-kube-api-access-fgcpq" (OuterVolumeSpecName: "kube-api-access-fgcpq") pod "fd97b565-7b98-4a48-8ff8-ed015b66da45" (UID: "fd97b565-7b98-4a48-8ff8-ed015b66da45"). InnerVolumeSpecName "kube-api-access-fgcpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:12:03 crc kubenswrapper[4687]: I0228 09:12:03.574692 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgcpq\" (UniqueName: \"kubernetes.io/projected/fd97b565-7b98-4a48-8ff8-ed015b66da45-kube-api-access-fgcpq\") on node \"crc\" DevicePath \"\"" Feb 28 09:12:04 crc kubenswrapper[4687]: I0228 09:12:04.233977 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537832-gnr45" event={"ID":"fd97b565-7b98-4a48-8ff8-ed015b66da45","Type":"ContainerDied","Data":"53fab24c2c419507361c608eca701c0575258f1f965709796b93ff30245389dd"} Feb 28 09:12:04 crc kubenswrapper[4687]: I0228 09:12:04.234061 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537832-gnr45" Feb 28 09:12:04 crc kubenswrapper[4687]: I0228 09:12:04.234074 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53fab24c2c419507361c608eca701c0575258f1f965709796b93ff30245389dd" Feb 28 09:12:04 crc kubenswrapper[4687]: I0228 09:12:04.438113 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537826-t7ghx"] Feb 28 09:12:04 crc kubenswrapper[4687]: I0228 09:12:04.439706 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537826-t7ghx"] Feb 28 09:12:04 crc kubenswrapper[4687]: I0228 09:12:04.662171 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecbc5046-5a52-46b8-8d92-45e22891bd1d" path="/var/lib/kubelet/pods/ecbc5046-5a52-46b8-8d92-45e22891bd1d/volumes" Feb 28 09:12:37 crc kubenswrapper[4687]: I0228 09:12:37.338566 4687 scope.go:117] "RemoveContainer" containerID="0f446d8f8117a09c10f580b771adb69dacf34cd28c37ea0c9ccc12f3d3093445" Feb 28 09:12:55 crc kubenswrapper[4687]: I0228 09:12:55.002823 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:12:55 crc kubenswrapper[4687]: I0228 09:12:55.003438 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.265690 4687 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/image-registry-66df7c8f76-grvn5"] Feb 28 09:13:03 crc kubenswrapper[4687]: E0228 09:13:03.266131 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd97b565-7b98-4a48-8ff8-ed015b66da45" containerName="oc" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.266144 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd97b565-7b98-4a48-8ff8-ed015b66da45" containerName="oc" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.266240 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd97b565-7b98-4a48-8ff8-ed015b66da45" containerName="oc" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.266555 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.286460 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-grvn5"] Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.452927 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/09b94d90-e238-4938-b27f-dde65f7e831b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-grvn5\" (UID: \"09b94d90-e238-4938-b27f-dde65f7e831b\") " pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.453177 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/09b94d90-e238-4938-b27f-dde65f7e831b-bound-sa-token\") pod \"image-registry-66df7c8f76-grvn5\" (UID: \"09b94d90-e238-4938-b27f-dde65f7e831b\") " pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.453294 4687 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-grvn5\" (UID: \"09b94d90-e238-4938-b27f-dde65f7e831b\") " pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.453385 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/09b94d90-e238-4938-b27f-dde65f7e831b-registry-tls\") pod \"image-registry-66df7c8f76-grvn5\" (UID: \"09b94d90-e238-4938-b27f-dde65f7e831b\") " pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.453477 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/09b94d90-e238-4938-b27f-dde65f7e831b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-grvn5\" (UID: \"09b94d90-e238-4938-b27f-dde65f7e831b\") " pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.453575 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrb2t\" (UniqueName: \"kubernetes.io/projected/09b94d90-e238-4938-b27f-dde65f7e831b-kube-api-access-jrb2t\") pod \"image-registry-66df7c8f76-grvn5\" (UID: \"09b94d90-e238-4938-b27f-dde65f7e831b\") " pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.453659 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/09b94d90-e238-4938-b27f-dde65f7e831b-registry-certificates\") pod \"image-registry-66df7c8f76-grvn5\" (UID: 
\"09b94d90-e238-4938-b27f-dde65f7e831b\") " pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.453752 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/09b94d90-e238-4938-b27f-dde65f7e831b-trusted-ca\") pod \"image-registry-66df7c8f76-grvn5\" (UID: \"09b94d90-e238-4938-b27f-dde65f7e831b\") " pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.474184 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-grvn5\" (UID: \"09b94d90-e238-4938-b27f-dde65f7e831b\") " pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.554435 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/09b94d90-e238-4938-b27f-dde65f7e831b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-grvn5\" (UID: \"09b94d90-e238-4938-b27f-dde65f7e831b\") " pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.554484 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/09b94d90-e238-4938-b27f-dde65f7e831b-bound-sa-token\") pod \"image-registry-66df7c8f76-grvn5\" (UID: \"09b94d90-e238-4938-b27f-dde65f7e831b\") " pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.554512 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/09b94d90-e238-4938-b27f-dde65f7e831b-registry-tls\") pod \"image-registry-66df7c8f76-grvn5\" (UID: \"09b94d90-e238-4938-b27f-dde65f7e831b\") " pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.554543 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/09b94d90-e238-4938-b27f-dde65f7e831b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-grvn5\" (UID: \"09b94d90-e238-4938-b27f-dde65f7e831b\") " pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.554589 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrb2t\" (UniqueName: \"kubernetes.io/projected/09b94d90-e238-4938-b27f-dde65f7e831b-kube-api-access-jrb2t\") pod \"image-registry-66df7c8f76-grvn5\" (UID: \"09b94d90-e238-4938-b27f-dde65f7e831b\") " pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.554617 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/09b94d90-e238-4938-b27f-dde65f7e831b-registry-certificates\") pod \"image-registry-66df7c8f76-grvn5\" (UID: \"09b94d90-e238-4938-b27f-dde65f7e831b\") " pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.554643 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/09b94d90-e238-4938-b27f-dde65f7e831b-trusted-ca\") pod \"image-registry-66df7c8f76-grvn5\" (UID: \"09b94d90-e238-4938-b27f-dde65f7e831b\") " pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.555301 4687 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/09b94d90-e238-4938-b27f-dde65f7e831b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-grvn5\" (UID: \"09b94d90-e238-4938-b27f-dde65f7e831b\") " pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.556144 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/09b94d90-e238-4938-b27f-dde65f7e831b-registry-certificates\") pod \"image-registry-66df7c8f76-grvn5\" (UID: \"09b94d90-e238-4938-b27f-dde65f7e831b\") " pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.556539 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/09b94d90-e238-4938-b27f-dde65f7e831b-trusted-ca\") pod \"image-registry-66df7c8f76-grvn5\" (UID: \"09b94d90-e238-4938-b27f-dde65f7e831b\") " pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.561449 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/09b94d90-e238-4938-b27f-dde65f7e831b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-grvn5\" (UID: \"09b94d90-e238-4938-b27f-dde65f7e831b\") " pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.562568 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/09b94d90-e238-4938-b27f-dde65f7e831b-registry-tls\") pod \"image-registry-66df7c8f76-grvn5\" (UID: \"09b94d90-e238-4938-b27f-dde65f7e831b\") " pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 
09:13:03.569745 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/09b94d90-e238-4938-b27f-dde65f7e831b-bound-sa-token\") pod \"image-registry-66df7c8f76-grvn5\" (UID: \"09b94d90-e238-4938-b27f-dde65f7e831b\") " pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.570038 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrb2t\" (UniqueName: \"kubernetes.io/projected/09b94d90-e238-4938-b27f-dde65f7e831b-kube-api-access-jrb2t\") pod \"image-registry-66df7c8f76-grvn5\" (UID: \"09b94d90-e238-4938-b27f-dde65f7e831b\") " pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.578840 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:03 crc kubenswrapper[4687]: I0228 09:13:03.934227 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-grvn5"] Feb 28 09:13:04 crc kubenswrapper[4687]: I0228 09:13:04.567096 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" event={"ID":"09b94d90-e238-4938-b27f-dde65f7e831b","Type":"ContainerStarted","Data":"b787ce4e3f2354ba62990b0378e167136b83e96e55bdd6525bd136502eb4b0a5"} Feb 28 09:13:04 crc kubenswrapper[4687]: I0228 09:13:04.567433 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:04 crc kubenswrapper[4687]: I0228 09:13:04.567444 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" 
event={"ID":"09b94d90-e238-4938-b27f-dde65f7e831b","Type":"ContainerStarted","Data":"9bec1d11a5136d2d4e41bada5df2691fa287f58038b48667450739e29c802d88"} Feb 28 09:13:04 crc kubenswrapper[4687]: I0228 09:13:04.583121 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" podStartSLOduration=1.583107824 podStartE2EDuration="1.583107824s" podCreationTimestamp="2026-02-28 09:13:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:13:04.580079821 +0000 UTC m=+576.270649158" watchObservedRunningTime="2026-02-28 09:13:04.583107824 +0000 UTC m=+576.273677161" Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.534983 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-jrbfz"] Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.535893 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-jrbfz" Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.537672 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-gg5lw"] Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.538141 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-gg5lw" Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.538419 4687 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-69mrs" Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.539398 4687 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-lrps9" Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.539559 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.539708 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.540155 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-gg5lw"] Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.542263 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-jrbfz"] Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.546874 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-h6sww"] Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.547467 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-h6sww" Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.548779 4687 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-qdh66" Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.556655 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-h6sww"] Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.653700 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tgr5\" (UniqueName: \"kubernetes.io/projected/5b4222a9-1f7a-48de-879a-4c5dc9d4d99d-kube-api-access-2tgr5\") pod \"cert-manager-858654f9db-jrbfz\" (UID: \"5b4222a9-1f7a-48de-879a-4c5dc9d4d99d\") " pod="cert-manager/cert-manager-858654f9db-jrbfz" Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.653745 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x28z\" (UniqueName: \"kubernetes.io/projected/45a33d4f-01db-48af-aa18-b0a18834a9ab-kube-api-access-8x28z\") pod \"cert-manager-cainjector-cf98fcc89-gg5lw\" (UID: \"45a33d4f-01db-48af-aa18-b0a18834a9ab\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-gg5lw" Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.653792 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z7gl\" (UniqueName: \"kubernetes.io/projected/42c2a835-9620-4ed3-8dc5-dbe24b201af7-kube-api-access-6z7gl\") pod \"cert-manager-webhook-687f57d79b-h6sww\" (UID: \"42c2a835-9620-4ed3-8dc5-dbe24b201af7\") " pod="cert-manager/cert-manager-webhook-687f57d79b-h6sww" Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.755061 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z7gl\" (UniqueName: 
\"kubernetes.io/projected/42c2a835-9620-4ed3-8dc5-dbe24b201af7-kube-api-access-6z7gl\") pod \"cert-manager-webhook-687f57d79b-h6sww\" (UID: \"42c2a835-9620-4ed3-8dc5-dbe24b201af7\") " pod="cert-manager/cert-manager-webhook-687f57d79b-h6sww" Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.755175 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tgr5\" (UniqueName: \"kubernetes.io/projected/5b4222a9-1f7a-48de-879a-4c5dc9d4d99d-kube-api-access-2tgr5\") pod \"cert-manager-858654f9db-jrbfz\" (UID: \"5b4222a9-1f7a-48de-879a-4c5dc9d4d99d\") " pod="cert-manager/cert-manager-858654f9db-jrbfz" Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.755207 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x28z\" (UniqueName: \"kubernetes.io/projected/45a33d4f-01db-48af-aa18-b0a18834a9ab-kube-api-access-8x28z\") pod \"cert-manager-cainjector-cf98fcc89-gg5lw\" (UID: \"45a33d4f-01db-48af-aa18-b0a18834a9ab\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-gg5lw" Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.772084 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x28z\" (UniqueName: \"kubernetes.io/projected/45a33d4f-01db-48af-aa18-b0a18834a9ab-kube-api-access-8x28z\") pod \"cert-manager-cainjector-cf98fcc89-gg5lw\" (UID: \"45a33d4f-01db-48af-aa18-b0a18834a9ab\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-gg5lw" Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.772589 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tgr5\" (UniqueName: \"kubernetes.io/projected/5b4222a9-1f7a-48de-879a-4c5dc9d4d99d-kube-api-access-2tgr5\") pod \"cert-manager-858654f9db-jrbfz\" (UID: \"5b4222a9-1f7a-48de-879a-4c5dc9d4d99d\") " pod="cert-manager/cert-manager-858654f9db-jrbfz" Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.772961 4687 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6z7gl\" (UniqueName: \"kubernetes.io/projected/42c2a835-9620-4ed3-8dc5-dbe24b201af7-kube-api-access-6z7gl\") pod \"cert-manager-webhook-687f57d79b-h6sww\" (UID: \"42c2a835-9620-4ed3-8dc5-dbe24b201af7\") " pod="cert-manager/cert-manager-webhook-687f57d79b-h6sww" Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.849838 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-jrbfz" Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.858418 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-gg5lw" Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.864324 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-h6sww" Feb 28 09:13:11 crc kubenswrapper[4687]: I0228 09:13:11.993983 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-jrbfz"] Feb 28 09:13:11 crc kubenswrapper[4687]: W0228 09:13:11.999001 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b4222a9_1f7a_48de_879a_4c5dc9d4d99d.slice/crio-3713eb9dc5e8daf538385bd7ccf7bc15cac140b257dcb8f790367b6bbbc925d5 WatchSource:0}: Error finding container 3713eb9dc5e8daf538385bd7ccf7bc15cac140b257dcb8f790367b6bbbc925d5: Status 404 returned error can't find the container with id 3713eb9dc5e8daf538385bd7ccf7bc15cac140b257dcb8f790367b6bbbc925d5 Feb 28 09:13:12 crc kubenswrapper[4687]: I0228 09:13:12.241049 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-gg5lw"] Feb 28 09:13:12 crc kubenswrapper[4687]: I0228 09:13:12.244120 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-h6sww"] Feb 28 09:13:12 crc 
kubenswrapper[4687]: W0228 09:13:12.244834 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42c2a835_9620_4ed3_8dc5_dbe24b201af7.slice/crio-7d42bbac8e67d3e742d291b144967909c3c02353ebe2ef9ac32e162b7d6d4a44 WatchSource:0}: Error finding container 7d42bbac8e67d3e742d291b144967909c3c02353ebe2ef9ac32e162b7d6d4a44: Status 404 returned error can't find the container with id 7d42bbac8e67d3e742d291b144967909c3c02353ebe2ef9ac32e162b7d6d4a44 Feb 28 09:13:12 crc kubenswrapper[4687]: W0228 09:13:12.246624 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45a33d4f_01db_48af_aa18_b0a18834a9ab.slice/crio-d926617fc76d0d1c46e5acf7f699a6295521cd1574fe534158c083a8cdf4d687 WatchSource:0}: Error finding container d926617fc76d0d1c46e5acf7f699a6295521cd1574fe534158c083a8cdf4d687: Status 404 returned error can't find the container with id d926617fc76d0d1c46e5acf7f699a6295521cd1574fe534158c083a8cdf4d687 Feb 28 09:13:12 crc kubenswrapper[4687]: I0228 09:13:12.600837 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-h6sww" event={"ID":"42c2a835-9620-4ed3-8dc5-dbe24b201af7","Type":"ContainerStarted","Data":"7d42bbac8e67d3e742d291b144967909c3c02353ebe2ef9ac32e162b7d6d4a44"} Feb 28 09:13:12 crc kubenswrapper[4687]: I0228 09:13:12.601878 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-jrbfz" event={"ID":"5b4222a9-1f7a-48de-879a-4c5dc9d4d99d","Type":"ContainerStarted","Data":"3713eb9dc5e8daf538385bd7ccf7bc15cac140b257dcb8f790367b6bbbc925d5"} Feb 28 09:13:12 crc kubenswrapper[4687]: I0228 09:13:12.602804 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-gg5lw" 
event={"ID":"45a33d4f-01db-48af-aa18-b0a18834a9ab","Type":"ContainerStarted","Data":"d926617fc76d0d1c46e5acf7f699a6295521cd1574fe534158c083a8cdf4d687"} Feb 28 09:13:14 crc kubenswrapper[4687]: I0228 09:13:14.616880 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-jrbfz" event={"ID":"5b4222a9-1f7a-48de-879a-4c5dc9d4d99d","Type":"ContainerStarted","Data":"84a10a21ceaf5be04d7087d96a8e989957fe2f8dccc53a4598b5c7406c3bb4af"} Feb 28 09:13:14 crc kubenswrapper[4687]: I0228 09:13:14.634695 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-jrbfz" podStartSLOduration=1.698829172 podStartE2EDuration="3.634682141s" podCreationTimestamp="2026-02-28 09:13:11 +0000 UTC" firstStartedPulling="2026-02-28 09:13:12.002158635 +0000 UTC m=+583.692727972" lastFinishedPulling="2026-02-28 09:13:13.938011605 +0000 UTC m=+585.628580941" observedRunningTime="2026-02-28 09:13:14.633095377 +0000 UTC m=+586.323664724" watchObservedRunningTime="2026-02-28 09:13:14.634682141 +0000 UTC m=+586.325251477" Feb 28 09:13:15 crc kubenswrapper[4687]: I0228 09:13:15.622780 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-h6sww" event={"ID":"42c2a835-9620-4ed3-8dc5-dbe24b201af7","Type":"ContainerStarted","Data":"b7e3901047b5cf5793c540ac86970c9de63ce43e6f67715e9d7764232f432c14"} Feb 28 09:13:15 crc kubenswrapper[4687]: I0228 09:13:15.623171 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-h6sww" Feb 28 09:13:15 crc kubenswrapper[4687]: I0228 09:13:15.624698 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-gg5lw" event={"ID":"45a33d4f-01db-48af-aa18-b0a18834a9ab","Type":"ContainerStarted","Data":"153cbfaa060a7b1d58fa76e0fc19fff6a02ed1beebfd355f6e9dfb90bac2c220"} Feb 28 09:13:15 crc kubenswrapper[4687]: I0228 
09:13:15.637861 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-h6sww" podStartSLOduration=1.697304573 podStartE2EDuration="4.637834225s" podCreationTimestamp="2026-02-28 09:13:11 +0000 UTC" firstStartedPulling="2026-02-28 09:13:12.246726942 +0000 UTC m=+583.937296279" lastFinishedPulling="2026-02-28 09:13:15.187256593 +0000 UTC m=+586.877825931" observedRunningTime="2026-02-28 09:13:15.637766979 +0000 UTC m=+587.328336316" watchObservedRunningTime="2026-02-28 09:13:15.637834225 +0000 UTC m=+587.328403552" Feb 28 09:13:21 crc kubenswrapper[4687]: I0228 09:13:21.868963 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-h6sww" Feb 28 09:13:21 crc kubenswrapper[4687]: I0228 09:13:21.884678 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-gg5lw" podStartSLOduration=7.968644539 podStartE2EDuration="10.884662991s" podCreationTimestamp="2026-02-28 09:13:11 +0000 UTC" firstStartedPulling="2026-02-28 09:13:12.249494945 +0000 UTC m=+583.940064283" lastFinishedPulling="2026-02-28 09:13:15.165513399 +0000 UTC m=+586.856082735" observedRunningTime="2026-02-28 09:13:15.651347896 +0000 UTC m=+587.341917234" watchObservedRunningTime="2026-02-28 09:13:21.884662991 +0000 UTC m=+593.575232328" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.471355 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pxxbs"] Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.472067 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="ovn-controller" containerID="cri-o://3315597e6c52853b482ae4832ea6feecfc9df5367e543744629c2c2c01549524" gracePeriod=30 Feb 28 09:13:22 crc kubenswrapper[4687]: 
I0228 09:13:22.472159 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="nbdb" containerID="cri-o://b31592026f046beb9f55087c61629bae72620ad9b911c9119a51fde9fcfbc400" gracePeriod=30 Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.472200 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="kube-rbac-proxy-node" containerID="cri-o://0df0f66d344881d62a51a3d7cb2e78e7d70b39296f6db151f0b32128eef0e952" gracePeriod=30 Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.472239 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="ovn-acl-logging" containerID="cri-o://6c2efc7fdb93bd1977dfd4f6eb040fd7b07952a96fd088fbedadc3d247b5a8cf" gracePeriod=30 Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.472176 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="northd" containerID="cri-o://eab119a0723fc00191b2e42af13974c57d3b8a6e093f64a447ef44e6233939f1" gracePeriod=30 Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.472191 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://e05d992d540a0c3e4ff5b4e369ab55f1b052cba2c72e2326420dc0fe965876ef" gracePeriod=30 Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.474806 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" 
podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="sbdb" containerID="cri-o://45a682fb389eb2922f9a81a4926ed88798abfd97ba87f9f2856af948689542ec" gracePeriod=30 Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.506464 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="ovnkube-controller" containerID="cri-o://254ddd94a12e93d8d2140366c6c34589ae43e5587dcf0c17e398855fc360a886" gracePeriod=30 Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.664886 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8rkhw_8ee9f985-2783-4c64-913f-c471571a46a3/kube-multus/0.log" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.664938 4687 generic.go:334] "Generic (PLEG): container finished" podID="8ee9f985-2783-4c64-913f-c471571a46a3" containerID="201fdaa3afc315e7615e0485f0fa4a8903fd0890ebeadae45599f1f4dd946034" exitCode=2 Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.664993 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8rkhw" event={"ID":"8ee9f985-2783-4c64-913f-c471571a46a3","Type":"ContainerDied","Data":"201fdaa3afc315e7615e0485f0fa4a8903fd0890ebeadae45599f1f4dd946034"} Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.665462 4687 scope.go:117] "RemoveContainer" containerID="201fdaa3afc315e7615e0485f0fa4a8903fd0890ebeadae45599f1f4dd946034" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.672419 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxxbs_4fb29f6b-2e87-454b-966f-5202547e1b6d/ovn-acl-logging/0.log" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.672924 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxxbs_4fb29f6b-2e87-454b-966f-5202547e1b6d/ovn-controller/0.log" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 
09:13:22.673267 4687 generic.go:334] "Generic (PLEG): container finished" podID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerID="254ddd94a12e93d8d2140366c6c34589ae43e5587dcf0c17e398855fc360a886" exitCode=0 Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.673355 4687 generic.go:334] "Generic (PLEG): container finished" podID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerID="e05d992d540a0c3e4ff5b4e369ab55f1b052cba2c72e2326420dc0fe965876ef" exitCode=0 Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.673413 4687 generic.go:334] "Generic (PLEG): container finished" podID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerID="0df0f66d344881d62a51a3d7cb2e78e7d70b39296f6db151f0b32128eef0e952" exitCode=0 Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.673469 4687 generic.go:334] "Generic (PLEG): container finished" podID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerID="6c2efc7fdb93bd1977dfd4f6eb040fd7b07952a96fd088fbedadc3d247b5a8cf" exitCode=143 Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.673523 4687 generic.go:334] "Generic (PLEG): container finished" podID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerID="3315597e6c52853b482ae4832ea6feecfc9df5367e543744629c2c2c01549524" exitCode=143 Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.673334 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" event={"ID":"4fb29f6b-2e87-454b-966f-5202547e1b6d","Type":"ContainerDied","Data":"254ddd94a12e93d8d2140366c6c34589ae43e5587dcf0c17e398855fc360a886"} Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.673682 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" event={"ID":"4fb29f6b-2e87-454b-966f-5202547e1b6d","Type":"ContainerDied","Data":"e05d992d540a0c3e4ff5b4e369ab55f1b052cba2c72e2326420dc0fe965876ef"} Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.673749 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" event={"ID":"4fb29f6b-2e87-454b-966f-5202547e1b6d","Type":"ContainerDied","Data":"0df0f66d344881d62a51a3d7cb2e78e7d70b39296f6db151f0b32128eef0e952"} Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.673833 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" event={"ID":"4fb29f6b-2e87-454b-966f-5202547e1b6d","Type":"ContainerDied","Data":"6c2efc7fdb93bd1977dfd4f6eb040fd7b07952a96fd088fbedadc3d247b5a8cf"} Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.673905 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" event={"ID":"4fb29f6b-2e87-454b-966f-5202547e1b6d","Type":"ContainerDied","Data":"3315597e6c52853b482ae4832ea6feecfc9df5367e543744629c2c2c01549524"} Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.750773 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxxbs_4fb29f6b-2e87-454b-966f-5202547e1b6d/ovn-acl-logging/0.log" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.751913 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxxbs_4fb29f6b-2e87-454b-966f-5202547e1b6d/ovn-controller/0.log" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.752350 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.790511 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-kubelet\") pod \"4fb29f6b-2e87-454b-966f-5202547e1b6d\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.790594 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-run-ovn\") pod \"4fb29f6b-2e87-454b-966f-5202547e1b6d\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.790621 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4fb29f6b-2e87-454b-966f-5202547e1b6d-ovn-node-metrics-cert\") pod \"4fb29f6b-2e87-454b-966f-5202547e1b6d\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.790618 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "4fb29f6b-2e87-454b-966f-5202547e1b6d" (UID: "4fb29f6b-2e87-454b-966f-5202547e1b6d"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.790645 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqfcw\" (UniqueName: \"kubernetes.io/projected/4fb29f6b-2e87-454b-966f-5202547e1b6d-kube-api-access-pqfcw\") pod \"4fb29f6b-2e87-454b-966f-5202547e1b6d\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.790662 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-run-openvswitch\") pod \"4fb29f6b-2e87-454b-966f-5202547e1b6d\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.790663 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "4fb29f6b-2e87-454b-966f-5202547e1b6d" (UID: "4fb29f6b-2e87-454b-966f-5202547e1b6d"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.790704 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4fb29f6b-2e87-454b-966f-5202547e1b6d-ovnkube-script-lib\") pod \"4fb29f6b-2e87-454b-966f-5202547e1b6d\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.790724 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4fb29f6b-2e87-454b-966f-5202547e1b6d-ovnkube-config\") pod \"4fb29f6b-2e87-454b-966f-5202547e1b6d\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.790738 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4fb29f6b-2e87-454b-966f-5202547e1b6d-env-overrides\") pod \"4fb29f6b-2e87-454b-966f-5202547e1b6d\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.790757 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-run-netns\") pod \"4fb29f6b-2e87-454b-966f-5202547e1b6d\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.790776 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-systemd-units\") pod \"4fb29f6b-2e87-454b-966f-5202547e1b6d\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.790800 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-run-ovn-kubernetes\") pod \"4fb29f6b-2e87-454b-966f-5202547e1b6d\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.790816 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-cni-bin\") pod \"4fb29f6b-2e87-454b-966f-5202547e1b6d\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.790928 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-run-systemd\") pod \"4fb29f6b-2e87-454b-966f-5202547e1b6d\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.790953 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-node-log\") pod \"4fb29f6b-2e87-454b-966f-5202547e1b6d\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.790971 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-var-lib-openvswitch\") pod \"4fb29f6b-2e87-454b-966f-5202547e1b6d\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.790988 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"4fb29f6b-2e87-454b-966f-5202547e1b6d\" (UID: 
\"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.791045 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-log-socket\") pod \"4fb29f6b-2e87-454b-966f-5202547e1b6d\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.791065 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-cni-netd\") pod \"4fb29f6b-2e87-454b-966f-5202547e1b6d\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.791085 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-slash\") pod \"4fb29f6b-2e87-454b-966f-5202547e1b6d\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.791103 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-etc-openvswitch\") pod \"4fb29f6b-2e87-454b-966f-5202547e1b6d\" (UID: \"4fb29f6b-2e87-454b-966f-5202547e1b6d\") " Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.791309 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "4fb29f6b-2e87-454b-966f-5202547e1b6d" (UID: "4fb29f6b-2e87-454b-966f-5202547e1b6d"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.791375 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "4fb29f6b-2e87-454b-966f-5202547e1b6d" (UID: "4fb29f6b-2e87-454b-966f-5202547e1b6d"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.791380 4687 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.791416 4687 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.791432 4687 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.791456 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "4fb29f6b-2e87-454b-966f-5202547e1b6d" (UID: "4fb29f6b-2e87-454b-966f-5202547e1b6d"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.791835 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "4fb29f6b-2e87-454b-966f-5202547e1b6d" (UID: "4fb29f6b-2e87-454b-966f-5202547e1b6d"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.791881 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "4fb29f6b-2e87-454b-966f-5202547e1b6d" (UID: "4fb29f6b-2e87-454b-966f-5202547e1b6d"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.792012 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fb29f6b-2e87-454b-966f-5202547e1b6d-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "4fb29f6b-2e87-454b-966f-5202547e1b6d" (UID: "4fb29f6b-2e87-454b-966f-5202547e1b6d"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.792105 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "4fb29f6b-2e87-454b-966f-5202547e1b6d" (UID: "4fb29f6b-2e87-454b-966f-5202547e1b6d"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.792143 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "4fb29f6b-2e87-454b-966f-5202547e1b6d" (UID: "4fb29f6b-2e87-454b-966f-5202547e1b6d"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.792167 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "4fb29f6b-2e87-454b-966f-5202547e1b6d" (UID: "4fb29f6b-2e87-454b-966f-5202547e1b6d"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.792172 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-node-log" (OuterVolumeSpecName: "node-log") pod "4fb29f6b-2e87-454b-966f-5202547e1b6d" (UID: "4fb29f6b-2e87-454b-966f-5202547e1b6d"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.792186 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "4fb29f6b-2e87-454b-966f-5202547e1b6d" (UID: "4fb29f6b-2e87-454b-966f-5202547e1b6d"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.792206 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-log-socket" (OuterVolumeSpecName: "log-socket") pod "4fb29f6b-2e87-454b-966f-5202547e1b6d" (UID: "4fb29f6b-2e87-454b-966f-5202547e1b6d"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.792207 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-slash" (OuterVolumeSpecName: "host-slash") pod "4fb29f6b-2e87-454b-966f-5202547e1b6d" (UID: "4fb29f6b-2e87-454b-966f-5202547e1b6d"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.792349 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fb29f6b-2e87-454b-966f-5202547e1b6d-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "4fb29f6b-2e87-454b-966f-5202547e1b6d" (UID: "4fb29f6b-2e87-454b-966f-5202547e1b6d"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.792496 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fb29f6b-2e87-454b-966f-5202547e1b6d-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "4fb29f6b-2e87-454b-966f-5202547e1b6d" (UID: "4fb29f6b-2e87-454b-966f-5202547e1b6d"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.793965 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-szdvj"] Feb 28 09:13:22 crc kubenswrapper[4687]: E0228 09:13:22.794202 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="kubecfg-setup" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.794217 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="kubecfg-setup" Feb 28 09:13:22 crc kubenswrapper[4687]: E0228 09:13:22.794227 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="ovn-acl-logging" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.794233 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="ovn-acl-logging" Feb 28 09:13:22 crc kubenswrapper[4687]: E0228 09:13:22.794240 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="kube-rbac-proxy-node" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.794245 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="kube-rbac-proxy-node" Feb 28 09:13:22 crc kubenswrapper[4687]: E0228 09:13:22.794255 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="northd" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.794262 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="northd" Feb 28 09:13:22 crc kubenswrapper[4687]: E0228 09:13:22.794268 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="nbdb" Feb 28 
09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.794273 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="nbdb" Feb 28 09:13:22 crc kubenswrapper[4687]: E0228 09:13:22.794281 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="sbdb" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.794287 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="sbdb" Feb 28 09:13:22 crc kubenswrapper[4687]: E0228 09:13:22.794296 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="kube-rbac-proxy-ovn-metrics" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.794301 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="kube-rbac-proxy-ovn-metrics" Feb 28 09:13:22 crc kubenswrapper[4687]: E0228 09:13:22.794308 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="ovnkube-controller" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.794313 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="ovnkube-controller" Feb 28 09:13:22 crc kubenswrapper[4687]: E0228 09:13:22.794320 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="ovn-controller" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.794325 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="ovn-controller" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.794418 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="kube-rbac-proxy-node" Feb 28 09:13:22 crc 
kubenswrapper[4687]: I0228 09:13:22.794428 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="northd" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.794434 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="ovnkube-controller" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.794442 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="sbdb" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.794450 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="ovn-acl-logging" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.794459 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="nbdb" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.794467 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="ovn-controller" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.794474 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerName="kube-rbac-proxy-ovn-metrics" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.796810 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fb29f6b-2e87-454b-966f-5202547e1b6d-kube-api-access-pqfcw" (OuterVolumeSpecName: "kube-api-access-pqfcw") pod "4fb29f6b-2e87-454b-966f-5202547e1b6d" (UID: "4fb29f6b-2e87-454b-966f-5202547e1b6d"). InnerVolumeSpecName "kube-api-access-pqfcw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.797343 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fb29f6b-2e87-454b-966f-5202547e1b6d-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "4fb29f6b-2e87-454b-966f-5202547e1b6d" (UID: "4fb29f6b-2e87-454b-966f-5202547e1b6d"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.801849 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.804332 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "4fb29f6b-2e87-454b-966f-5202547e1b6d" (UID: "4fb29f6b-2e87-454b-966f-5202547e1b6d"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.891846 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-host-run-ovn-kubernetes\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.891943 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-etc-openvswitch\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.891996 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-host-run-netns\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892056 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/462d112d-d672-4410-928b-240a62ba95a5-ovnkube-config\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892079 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/462d112d-d672-4410-928b-240a62ba95a5-ovn-node-metrics-cert\") pod 
\"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892105 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5xs9\" (UniqueName: \"kubernetes.io/projected/462d112d-d672-4410-928b-240a62ba95a5-kube-api-access-c5xs9\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892122 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-systemd-units\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892141 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-host-slash\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892164 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-run-openvswitch\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892184 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-run-systemd\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892202 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-run-ovn\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892255 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-host-cni-bin\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892270 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-host-cni-netd\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892284 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-node-log\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892295 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-log-socket\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892311 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-host-kubelet\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892337 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892357 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-var-lib-openvswitch\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892370 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/462d112d-d672-4410-928b-240a62ba95a5-ovnkube-script-lib\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892397 4687 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/462d112d-d672-4410-928b-240a62ba95a5-env-overrides\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892441 4687 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-node-log\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892452 4687 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892462 4687 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892470 4687 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-log-socket\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892479 4687 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892487 4687 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-slash\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 
09:13:22.892495 4687 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892503 4687 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4fb29f6b-2e87-454b-966f-5202547e1b6d-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892511 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqfcw\" (UniqueName: \"kubernetes.io/projected/4fb29f6b-2e87-454b-966f-5202547e1b6d-kube-api-access-pqfcw\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892521 4687 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892529 4687 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4fb29f6b-2e87-454b-966f-5202547e1b6d-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892537 4687 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4fb29f6b-2e87-454b-966f-5202547e1b6d-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892545 4687 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4fb29f6b-2e87-454b-966f-5202547e1b6d-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892553 4687 reconciler_common.go:293] "Volume detached for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892561 4687 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892570 4687 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.892578 4687 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4fb29f6b-2e87-454b-966f-5202547e1b6d-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.992805 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-run-openvswitch\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.992845 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-run-systemd\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.992865 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-run-ovn\") pod \"ovnkube-node-szdvj\" (UID: 
\"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.992892 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-host-cni-bin\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.992908 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-host-cni-netd\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.992925 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-host-kubelet\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.992937 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-node-log\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.992952 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-log-socket\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 
crc kubenswrapper[4687]: I0228 09:13:22.992971 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.992969 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-run-openvswitch\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.992995 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-run-ovn\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.993030 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-run-systemd\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.993033 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-var-lib-openvswitch\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.993067 4687 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.993078 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-log-socket\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.993059 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-host-cni-bin\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.992995 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-host-cni-netd\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.992988 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-var-lib-openvswitch\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.993049 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-host-kubelet\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.993127 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-node-log\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.993174 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/462d112d-d672-4410-928b-240a62ba95a5-ovnkube-script-lib\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.993231 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/462d112d-d672-4410-928b-240a62ba95a5-env-overrides\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.993280 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-host-run-ovn-kubernetes\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.993338 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-etc-openvswitch\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.993386 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-host-run-ovn-kubernetes\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.993404 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-host-run-netns\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.993421 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-etc-openvswitch\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.993432 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/462d112d-d672-4410-928b-240a62ba95a5-ovnkube-config\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.993443 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-host-run-netns\") 
pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.993458 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/462d112d-d672-4410-928b-240a62ba95a5-ovn-node-metrics-cert\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.993501 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5xs9\" (UniqueName: \"kubernetes.io/projected/462d112d-d672-4410-928b-240a62ba95a5-kube-api-access-c5xs9\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.993529 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-systemd-units\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.993545 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-host-slash\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.993633 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-host-slash\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.994041 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/462d112d-d672-4410-928b-240a62ba95a5-env-overrides\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.994160 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/462d112d-d672-4410-928b-240a62ba95a5-ovnkube-script-lib\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.994213 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/462d112d-d672-4410-928b-240a62ba95a5-systemd-units\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.994227 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/462d112d-d672-4410-928b-240a62ba95a5-ovnkube-config\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:22 crc kubenswrapper[4687]: I0228 09:13:22.997502 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/462d112d-d672-4410-928b-240a62ba95a5-ovn-node-metrics-cert\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 
09:13:23.007331 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5xs9\" (UniqueName: \"kubernetes.io/projected/462d112d-d672-4410-928b-240a62ba95a5-kube-api-access-c5xs9\") pod \"ovnkube-node-szdvj\" (UID: \"462d112d-d672-4410-928b-240a62ba95a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.126128 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:23 crc kubenswrapper[4687]: W0228 09:13:23.141433 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod462d112d_d672_4410_928b_240a62ba95a5.slice/crio-67d7c1d1f817e48db958ce0813efeafc3b6fefa8e47b7a7db03c540cc971ca64 WatchSource:0}: Error finding container 67d7c1d1f817e48db958ce0813efeafc3b6fefa8e47b7a7db03c540cc971ca64: Status 404 returned error can't find the container with id 67d7c1d1f817e48db958ce0813efeafc3b6fefa8e47b7a7db03c540cc971ca64 Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.584858 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-grvn5" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.629638 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-n9h5k"] Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.679585 4687 generic.go:334] "Generic (PLEG): container finished" podID="462d112d-d672-4410-928b-240a62ba95a5" containerID="8f7f3680c12e0cd1627fd911d446ac22c2cc06347c6ec0f097a33679b21021e1" exitCode=0 Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.679649 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" 
event={"ID":"462d112d-d672-4410-928b-240a62ba95a5","Type":"ContainerDied","Data":"8f7f3680c12e0cd1627fd911d446ac22c2cc06347c6ec0f097a33679b21021e1"} Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.679695 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" event={"ID":"462d112d-d672-4410-928b-240a62ba95a5","Type":"ContainerStarted","Data":"67d7c1d1f817e48db958ce0813efeafc3b6fefa8e47b7a7db03c540cc971ca64"} Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.681360 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8rkhw_8ee9f985-2783-4c64-913f-c471571a46a3/kube-multus/0.log" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.681422 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8rkhw" event={"ID":"8ee9f985-2783-4c64-913f-c471571a46a3","Type":"ContainerStarted","Data":"b9d9eae073c8851016635a714af1f0125d820ed5b821fe39b3ac5c6f8d587122"} Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.684688 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxxbs_4fb29f6b-2e87-454b-966f-5202547e1b6d/ovn-acl-logging/0.log" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.685196 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxxbs_4fb29f6b-2e87-454b-966f-5202547e1b6d/ovn-controller/0.log" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.685503 4687 generic.go:334] "Generic (PLEG): container finished" podID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerID="45a682fb389eb2922f9a81a4926ed88798abfd97ba87f9f2856af948689542ec" exitCode=0 Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.685525 4687 generic.go:334] "Generic (PLEG): container finished" podID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerID="b31592026f046beb9f55087c61629bae72620ad9b911c9119a51fde9fcfbc400" exitCode=0 Feb 28 09:13:23 crc 
kubenswrapper[4687]: I0228 09:13:23.685532 4687 generic.go:334] "Generic (PLEG): container finished" podID="4fb29f6b-2e87-454b-966f-5202547e1b6d" containerID="eab119a0723fc00191b2e42af13974c57d3b8a6e093f64a447ef44e6233939f1" exitCode=0 Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.685550 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" event={"ID":"4fb29f6b-2e87-454b-966f-5202547e1b6d","Type":"ContainerDied","Data":"45a682fb389eb2922f9a81a4926ed88798abfd97ba87f9f2856af948689542ec"} Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.685575 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" event={"ID":"4fb29f6b-2e87-454b-966f-5202547e1b6d","Type":"ContainerDied","Data":"b31592026f046beb9f55087c61629bae72620ad9b911c9119a51fde9fcfbc400"} Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.685578 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.685586 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" event={"ID":"4fb29f6b-2e87-454b-966f-5202547e1b6d","Type":"ContainerDied","Data":"eab119a0723fc00191b2e42af13974c57d3b8a6e093f64a447ef44e6233939f1"} Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.685595 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxxbs" event={"ID":"4fb29f6b-2e87-454b-966f-5202547e1b6d","Type":"ContainerDied","Data":"99eb6843f3a9b1bb0df85a41197c4957994630f01196752b0dd5e9b8984d629a"} Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.685612 4687 scope.go:117] "RemoveContainer" containerID="254ddd94a12e93d8d2140366c6c34589ae43e5587dcf0c17e398855fc360a886" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.707710 4687 scope.go:117] "RemoveContainer" 
containerID="45a682fb389eb2922f9a81a4926ed88798abfd97ba87f9f2856af948689542ec" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.738263 4687 scope.go:117] "RemoveContainer" containerID="b31592026f046beb9f55087c61629bae72620ad9b911c9119a51fde9fcfbc400" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.738772 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pxxbs"] Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.742210 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pxxbs"] Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.754549 4687 scope.go:117] "RemoveContainer" containerID="eab119a0723fc00191b2e42af13974c57d3b8a6e093f64a447ef44e6233939f1" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.769646 4687 scope.go:117] "RemoveContainer" containerID="e05d992d540a0c3e4ff5b4e369ab55f1b052cba2c72e2326420dc0fe965876ef" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.779933 4687 scope.go:117] "RemoveContainer" containerID="0df0f66d344881d62a51a3d7cb2e78e7d70b39296f6db151f0b32128eef0e952" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.791362 4687 scope.go:117] "RemoveContainer" containerID="6c2efc7fdb93bd1977dfd4f6eb040fd7b07952a96fd088fbedadc3d247b5a8cf" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.805624 4687 scope.go:117] "RemoveContainer" containerID="3315597e6c52853b482ae4832ea6feecfc9df5367e543744629c2c2c01549524" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.817633 4687 scope.go:117] "RemoveContainer" containerID="1a421cd585e8ec22da1e4d34b32242f47995669a55b9c04c7c202de0cc70c595" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.840148 4687 scope.go:117] "RemoveContainer" containerID="254ddd94a12e93d8d2140366c6c34589ae43e5587dcf0c17e398855fc360a886" Feb 28 09:13:23 crc kubenswrapper[4687]: E0228 09:13:23.840906 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"254ddd94a12e93d8d2140366c6c34589ae43e5587dcf0c17e398855fc360a886\": container with ID starting with 254ddd94a12e93d8d2140366c6c34589ae43e5587dcf0c17e398855fc360a886 not found: ID does not exist" containerID="254ddd94a12e93d8d2140366c6c34589ae43e5587dcf0c17e398855fc360a886" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.840954 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"254ddd94a12e93d8d2140366c6c34589ae43e5587dcf0c17e398855fc360a886"} err="failed to get container status \"254ddd94a12e93d8d2140366c6c34589ae43e5587dcf0c17e398855fc360a886\": rpc error: code = NotFound desc = could not find container \"254ddd94a12e93d8d2140366c6c34589ae43e5587dcf0c17e398855fc360a886\": container with ID starting with 254ddd94a12e93d8d2140366c6c34589ae43e5587dcf0c17e398855fc360a886 not found: ID does not exist" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.840986 4687 scope.go:117] "RemoveContainer" containerID="45a682fb389eb2922f9a81a4926ed88798abfd97ba87f9f2856af948689542ec" Feb 28 09:13:23 crc kubenswrapper[4687]: E0228 09:13:23.841275 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45a682fb389eb2922f9a81a4926ed88798abfd97ba87f9f2856af948689542ec\": container with ID starting with 45a682fb389eb2922f9a81a4926ed88798abfd97ba87f9f2856af948689542ec not found: ID does not exist" containerID="45a682fb389eb2922f9a81a4926ed88798abfd97ba87f9f2856af948689542ec" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.841304 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45a682fb389eb2922f9a81a4926ed88798abfd97ba87f9f2856af948689542ec"} err="failed to get container status \"45a682fb389eb2922f9a81a4926ed88798abfd97ba87f9f2856af948689542ec\": rpc error: code = NotFound desc = could not find container 
\"45a682fb389eb2922f9a81a4926ed88798abfd97ba87f9f2856af948689542ec\": container with ID starting with 45a682fb389eb2922f9a81a4926ed88798abfd97ba87f9f2856af948689542ec not found: ID does not exist" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.841323 4687 scope.go:117] "RemoveContainer" containerID="b31592026f046beb9f55087c61629bae72620ad9b911c9119a51fde9fcfbc400" Feb 28 09:13:23 crc kubenswrapper[4687]: E0228 09:13:23.841563 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b31592026f046beb9f55087c61629bae72620ad9b911c9119a51fde9fcfbc400\": container with ID starting with b31592026f046beb9f55087c61629bae72620ad9b911c9119a51fde9fcfbc400 not found: ID does not exist" containerID="b31592026f046beb9f55087c61629bae72620ad9b911c9119a51fde9fcfbc400" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.841590 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b31592026f046beb9f55087c61629bae72620ad9b911c9119a51fde9fcfbc400"} err="failed to get container status \"b31592026f046beb9f55087c61629bae72620ad9b911c9119a51fde9fcfbc400\": rpc error: code = NotFound desc = could not find container \"b31592026f046beb9f55087c61629bae72620ad9b911c9119a51fde9fcfbc400\": container with ID starting with b31592026f046beb9f55087c61629bae72620ad9b911c9119a51fde9fcfbc400 not found: ID does not exist" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.841606 4687 scope.go:117] "RemoveContainer" containerID="eab119a0723fc00191b2e42af13974c57d3b8a6e093f64a447ef44e6233939f1" Feb 28 09:13:23 crc kubenswrapper[4687]: E0228 09:13:23.841916 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eab119a0723fc00191b2e42af13974c57d3b8a6e093f64a447ef44e6233939f1\": container with ID starting with eab119a0723fc00191b2e42af13974c57d3b8a6e093f64a447ef44e6233939f1 not found: ID does not exist" 
containerID="eab119a0723fc00191b2e42af13974c57d3b8a6e093f64a447ef44e6233939f1" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.841947 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eab119a0723fc00191b2e42af13974c57d3b8a6e093f64a447ef44e6233939f1"} err="failed to get container status \"eab119a0723fc00191b2e42af13974c57d3b8a6e093f64a447ef44e6233939f1\": rpc error: code = NotFound desc = could not find container \"eab119a0723fc00191b2e42af13974c57d3b8a6e093f64a447ef44e6233939f1\": container with ID starting with eab119a0723fc00191b2e42af13974c57d3b8a6e093f64a447ef44e6233939f1 not found: ID does not exist" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.841965 4687 scope.go:117] "RemoveContainer" containerID="e05d992d540a0c3e4ff5b4e369ab55f1b052cba2c72e2326420dc0fe965876ef" Feb 28 09:13:23 crc kubenswrapper[4687]: E0228 09:13:23.842184 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e05d992d540a0c3e4ff5b4e369ab55f1b052cba2c72e2326420dc0fe965876ef\": container with ID starting with e05d992d540a0c3e4ff5b4e369ab55f1b052cba2c72e2326420dc0fe965876ef not found: ID does not exist" containerID="e05d992d540a0c3e4ff5b4e369ab55f1b052cba2c72e2326420dc0fe965876ef" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.842209 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e05d992d540a0c3e4ff5b4e369ab55f1b052cba2c72e2326420dc0fe965876ef"} err="failed to get container status \"e05d992d540a0c3e4ff5b4e369ab55f1b052cba2c72e2326420dc0fe965876ef\": rpc error: code = NotFound desc = could not find container \"e05d992d540a0c3e4ff5b4e369ab55f1b052cba2c72e2326420dc0fe965876ef\": container with ID starting with e05d992d540a0c3e4ff5b4e369ab55f1b052cba2c72e2326420dc0fe965876ef not found: ID does not exist" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.842228 4687 scope.go:117] 
"RemoveContainer" containerID="0df0f66d344881d62a51a3d7cb2e78e7d70b39296f6db151f0b32128eef0e952" Feb 28 09:13:23 crc kubenswrapper[4687]: E0228 09:13:23.842540 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0df0f66d344881d62a51a3d7cb2e78e7d70b39296f6db151f0b32128eef0e952\": container with ID starting with 0df0f66d344881d62a51a3d7cb2e78e7d70b39296f6db151f0b32128eef0e952 not found: ID does not exist" containerID="0df0f66d344881d62a51a3d7cb2e78e7d70b39296f6db151f0b32128eef0e952" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.842695 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0df0f66d344881d62a51a3d7cb2e78e7d70b39296f6db151f0b32128eef0e952"} err="failed to get container status \"0df0f66d344881d62a51a3d7cb2e78e7d70b39296f6db151f0b32128eef0e952\": rpc error: code = NotFound desc = could not find container \"0df0f66d344881d62a51a3d7cb2e78e7d70b39296f6db151f0b32128eef0e952\": container with ID starting with 0df0f66d344881d62a51a3d7cb2e78e7d70b39296f6db151f0b32128eef0e952 not found: ID does not exist" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.849788 4687 scope.go:117] "RemoveContainer" containerID="6c2efc7fdb93bd1977dfd4f6eb040fd7b07952a96fd088fbedadc3d247b5a8cf" Feb 28 09:13:23 crc kubenswrapper[4687]: E0228 09:13:23.851285 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c2efc7fdb93bd1977dfd4f6eb040fd7b07952a96fd088fbedadc3d247b5a8cf\": container with ID starting with 6c2efc7fdb93bd1977dfd4f6eb040fd7b07952a96fd088fbedadc3d247b5a8cf not found: ID does not exist" containerID="6c2efc7fdb93bd1977dfd4f6eb040fd7b07952a96fd088fbedadc3d247b5a8cf" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.851337 4687 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6c2efc7fdb93bd1977dfd4f6eb040fd7b07952a96fd088fbedadc3d247b5a8cf"} err="failed to get container status \"6c2efc7fdb93bd1977dfd4f6eb040fd7b07952a96fd088fbedadc3d247b5a8cf\": rpc error: code = NotFound desc = could not find container \"6c2efc7fdb93bd1977dfd4f6eb040fd7b07952a96fd088fbedadc3d247b5a8cf\": container with ID starting with 6c2efc7fdb93bd1977dfd4f6eb040fd7b07952a96fd088fbedadc3d247b5a8cf not found: ID does not exist" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.851374 4687 scope.go:117] "RemoveContainer" containerID="3315597e6c52853b482ae4832ea6feecfc9df5367e543744629c2c2c01549524" Feb 28 09:13:23 crc kubenswrapper[4687]: E0228 09:13:23.851680 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3315597e6c52853b482ae4832ea6feecfc9df5367e543744629c2c2c01549524\": container with ID starting with 3315597e6c52853b482ae4832ea6feecfc9df5367e543744629c2c2c01549524 not found: ID does not exist" containerID="3315597e6c52853b482ae4832ea6feecfc9df5367e543744629c2c2c01549524" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.851708 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3315597e6c52853b482ae4832ea6feecfc9df5367e543744629c2c2c01549524"} err="failed to get container status \"3315597e6c52853b482ae4832ea6feecfc9df5367e543744629c2c2c01549524\": rpc error: code = NotFound desc = could not find container \"3315597e6c52853b482ae4832ea6feecfc9df5367e543744629c2c2c01549524\": container with ID starting with 3315597e6c52853b482ae4832ea6feecfc9df5367e543744629c2c2c01549524 not found: ID does not exist" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.851724 4687 scope.go:117] "RemoveContainer" containerID="1a421cd585e8ec22da1e4d34b32242f47995669a55b9c04c7c202de0cc70c595" Feb 28 09:13:23 crc kubenswrapper[4687]: E0228 09:13:23.852069 4687 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1a421cd585e8ec22da1e4d34b32242f47995669a55b9c04c7c202de0cc70c595\": container with ID starting with 1a421cd585e8ec22da1e4d34b32242f47995669a55b9c04c7c202de0cc70c595 not found: ID does not exist" containerID="1a421cd585e8ec22da1e4d34b32242f47995669a55b9c04c7c202de0cc70c595" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.852124 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a421cd585e8ec22da1e4d34b32242f47995669a55b9c04c7c202de0cc70c595"} err="failed to get container status \"1a421cd585e8ec22da1e4d34b32242f47995669a55b9c04c7c202de0cc70c595\": rpc error: code = NotFound desc = could not find container \"1a421cd585e8ec22da1e4d34b32242f47995669a55b9c04c7c202de0cc70c595\": container with ID starting with 1a421cd585e8ec22da1e4d34b32242f47995669a55b9c04c7c202de0cc70c595 not found: ID does not exist" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.852157 4687 scope.go:117] "RemoveContainer" containerID="254ddd94a12e93d8d2140366c6c34589ae43e5587dcf0c17e398855fc360a886" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.852555 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"254ddd94a12e93d8d2140366c6c34589ae43e5587dcf0c17e398855fc360a886"} err="failed to get container status \"254ddd94a12e93d8d2140366c6c34589ae43e5587dcf0c17e398855fc360a886\": rpc error: code = NotFound desc = could not find container \"254ddd94a12e93d8d2140366c6c34589ae43e5587dcf0c17e398855fc360a886\": container with ID starting with 254ddd94a12e93d8d2140366c6c34589ae43e5587dcf0c17e398855fc360a886 not found: ID does not exist" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.852595 4687 scope.go:117] "RemoveContainer" containerID="45a682fb389eb2922f9a81a4926ed88798abfd97ba87f9f2856af948689542ec" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.852880 4687 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45a682fb389eb2922f9a81a4926ed88798abfd97ba87f9f2856af948689542ec"} err="failed to get container status \"45a682fb389eb2922f9a81a4926ed88798abfd97ba87f9f2856af948689542ec\": rpc error: code = NotFound desc = could not find container \"45a682fb389eb2922f9a81a4926ed88798abfd97ba87f9f2856af948689542ec\": container with ID starting with 45a682fb389eb2922f9a81a4926ed88798abfd97ba87f9f2856af948689542ec not found: ID does not exist" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.852907 4687 scope.go:117] "RemoveContainer" containerID="b31592026f046beb9f55087c61629bae72620ad9b911c9119a51fde9fcfbc400" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.853204 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b31592026f046beb9f55087c61629bae72620ad9b911c9119a51fde9fcfbc400"} err="failed to get container status \"b31592026f046beb9f55087c61629bae72620ad9b911c9119a51fde9fcfbc400\": rpc error: code = NotFound desc = could not find container \"b31592026f046beb9f55087c61629bae72620ad9b911c9119a51fde9fcfbc400\": container with ID starting with b31592026f046beb9f55087c61629bae72620ad9b911c9119a51fde9fcfbc400 not found: ID does not exist" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.853234 4687 scope.go:117] "RemoveContainer" containerID="eab119a0723fc00191b2e42af13974c57d3b8a6e093f64a447ef44e6233939f1" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.853487 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eab119a0723fc00191b2e42af13974c57d3b8a6e093f64a447ef44e6233939f1"} err="failed to get container status \"eab119a0723fc00191b2e42af13974c57d3b8a6e093f64a447ef44e6233939f1\": rpc error: code = NotFound desc = could not find container \"eab119a0723fc00191b2e42af13974c57d3b8a6e093f64a447ef44e6233939f1\": container with ID starting with 
eab119a0723fc00191b2e42af13974c57d3b8a6e093f64a447ef44e6233939f1 not found: ID does not exist" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.853517 4687 scope.go:117] "RemoveContainer" containerID="e05d992d540a0c3e4ff5b4e369ab55f1b052cba2c72e2326420dc0fe965876ef" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.853779 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e05d992d540a0c3e4ff5b4e369ab55f1b052cba2c72e2326420dc0fe965876ef"} err="failed to get container status \"e05d992d540a0c3e4ff5b4e369ab55f1b052cba2c72e2326420dc0fe965876ef\": rpc error: code = NotFound desc = could not find container \"e05d992d540a0c3e4ff5b4e369ab55f1b052cba2c72e2326420dc0fe965876ef\": container with ID starting with e05d992d540a0c3e4ff5b4e369ab55f1b052cba2c72e2326420dc0fe965876ef not found: ID does not exist" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.853817 4687 scope.go:117] "RemoveContainer" containerID="0df0f66d344881d62a51a3d7cb2e78e7d70b39296f6db151f0b32128eef0e952" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.854246 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0df0f66d344881d62a51a3d7cb2e78e7d70b39296f6db151f0b32128eef0e952"} err="failed to get container status \"0df0f66d344881d62a51a3d7cb2e78e7d70b39296f6db151f0b32128eef0e952\": rpc error: code = NotFound desc = could not find container \"0df0f66d344881d62a51a3d7cb2e78e7d70b39296f6db151f0b32128eef0e952\": container with ID starting with 0df0f66d344881d62a51a3d7cb2e78e7d70b39296f6db151f0b32128eef0e952 not found: ID does not exist" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.854270 4687 scope.go:117] "RemoveContainer" containerID="6c2efc7fdb93bd1977dfd4f6eb040fd7b07952a96fd088fbedadc3d247b5a8cf" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.854554 4687 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6c2efc7fdb93bd1977dfd4f6eb040fd7b07952a96fd088fbedadc3d247b5a8cf"} err="failed to get container status \"6c2efc7fdb93bd1977dfd4f6eb040fd7b07952a96fd088fbedadc3d247b5a8cf\": rpc error: code = NotFound desc = could not find container \"6c2efc7fdb93bd1977dfd4f6eb040fd7b07952a96fd088fbedadc3d247b5a8cf\": container with ID starting with 6c2efc7fdb93bd1977dfd4f6eb040fd7b07952a96fd088fbedadc3d247b5a8cf not found: ID does not exist" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.854581 4687 scope.go:117] "RemoveContainer" containerID="3315597e6c52853b482ae4832ea6feecfc9df5367e543744629c2c2c01549524" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.854854 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3315597e6c52853b482ae4832ea6feecfc9df5367e543744629c2c2c01549524"} err="failed to get container status \"3315597e6c52853b482ae4832ea6feecfc9df5367e543744629c2c2c01549524\": rpc error: code = NotFound desc = could not find container \"3315597e6c52853b482ae4832ea6feecfc9df5367e543744629c2c2c01549524\": container with ID starting with 3315597e6c52853b482ae4832ea6feecfc9df5367e543744629c2c2c01549524 not found: ID does not exist" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.854888 4687 scope.go:117] "RemoveContainer" containerID="1a421cd585e8ec22da1e4d34b32242f47995669a55b9c04c7c202de0cc70c595" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.855212 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a421cd585e8ec22da1e4d34b32242f47995669a55b9c04c7c202de0cc70c595"} err="failed to get container status \"1a421cd585e8ec22da1e4d34b32242f47995669a55b9c04c7c202de0cc70c595\": rpc error: code = NotFound desc = could not find container \"1a421cd585e8ec22da1e4d34b32242f47995669a55b9c04c7c202de0cc70c595\": container with ID starting with 1a421cd585e8ec22da1e4d34b32242f47995669a55b9c04c7c202de0cc70c595 not found: ID does not 
exist" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.855261 4687 scope.go:117] "RemoveContainer" containerID="254ddd94a12e93d8d2140366c6c34589ae43e5587dcf0c17e398855fc360a886" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.855546 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"254ddd94a12e93d8d2140366c6c34589ae43e5587dcf0c17e398855fc360a886"} err="failed to get container status \"254ddd94a12e93d8d2140366c6c34589ae43e5587dcf0c17e398855fc360a886\": rpc error: code = NotFound desc = could not find container \"254ddd94a12e93d8d2140366c6c34589ae43e5587dcf0c17e398855fc360a886\": container with ID starting with 254ddd94a12e93d8d2140366c6c34589ae43e5587dcf0c17e398855fc360a886 not found: ID does not exist" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.855570 4687 scope.go:117] "RemoveContainer" containerID="45a682fb389eb2922f9a81a4926ed88798abfd97ba87f9f2856af948689542ec" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.855875 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45a682fb389eb2922f9a81a4926ed88798abfd97ba87f9f2856af948689542ec"} err="failed to get container status \"45a682fb389eb2922f9a81a4926ed88798abfd97ba87f9f2856af948689542ec\": rpc error: code = NotFound desc = could not find container \"45a682fb389eb2922f9a81a4926ed88798abfd97ba87f9f2856af948689542ec\": container with ID starting with 45a682fb389eb2922f9a81a4926ed88798abfd97ba87f9f2856af948689542ec not found: ID does not exist" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.855909 4687 scope.go:117] "RemoveContainer" containerID="b31592026f046beb9f55087c61629bae72620ad9b911c9119a51fde9fcfbc400" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.856220 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b31592026f046beb9f55087c61629bae72620ad9b911c9119a51fde9fcfbc400"} err="failed to get container status 
\"b31592026f046beb9f55087c61629bae72620ad9b911c9119a51fde9fcfbc400\": rpc error: code = NotFound desc = could not find container \"b31592026f046beb9f55087c61629bae72620ad9b911c9119a51fde9fcfbc400\": container with ID starting with b31592026f046beb9f55087c61629bae72620ad9b911c9119a51fde9fcfbc400 not found: ID does not exist" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.856256 4687 scope.go:117] "RemoveContainer" containerID="eab119a0723fc00191b2e42af13974c57d3b8a6e093f64a447ef44e6233939f1" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.856484 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eab119a0723fc00191b2e42af13974c57d3b8a6e093f64a447ef44e6233939f1"} err="failed to get container status \"eab119a0723fc00191b2e42af13974c57d3b8a6e093f64a447ef44e6233939f1\": rpc error: code = NotFound desc = could not find container \"eab119a0723fc00191b2e42af13974c57d3b8a6e093f64a447ef44e6233939f1\": container with ID starting with eab119a0723fc00191b2e42af13974c57d3b8a6e093f64a447ef44e6233939f1 not found: ID does not exist" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.856507 4687 scope.go:117] "RemoveContainer" containerID="e05d992d540a0c3e4ff5b4e369ab55f1b052cba2c72e2326420dc0fe965876ef" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.856782 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e05d992d540a0c3e4ff5b4e369ab55f1b052cba2c72e2326420dc0fe965876ef"} err="failed to get container status \"e05d992d540a0c3e4ff5b4e369ab55f1b052cba2c72e2326420dc0fe965876ef\": rpc error: code = NotFound desc = could not find container \"e05d992d540a0c3e4ff5b4e369ab55f1b052cba2c72e2326420dc0fe965876ef\": container with ID starting with e05d992d540a0c3e4ff5b4e369ab55f1b052cba2c72e2326420dc0fe965876ef not found: ID does not exist" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.856805 4687 scope.go:117] "RemoveContainer" 
containerID="0df0f66d344881d62a51a3d7cb2e78e7d70b39296f6db151f0b32128eef0e952" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.857136 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0df0f66d344881d62a51a3d7cb2e78e7d70b39296f6db151f0b32128eef0e952"} err="failed to get container status \"0df0f66d344881d62a51a3d7cb2e78e7d70b39296f6db151f0b32128eef0e952\": rpc error: code = NotFound desc = could not find container \"0df0f66d344881d62a51a3d7cb2e78e7d70b39296f6db151f0b32128eef0e952\": container with ID starting with 0df0f66d344881d62a51a3d7cb2e78e7d70b39296f6db151f0b32128eef0e952 not found: ID does not exist" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.857164 4687 scope.go:117] "RemoveContainer" containerID="6c2efc7fdb93bd1977dfd4f6eb040fd7b07952a96fd088fbedadc3d247b5a8cf" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.857484 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c2efc7fdb93bd1977dfd4f6eb040fd7b07952a96fd088fbedadc3d247b5a8cf"} err="failed to get container status \"6c2efc7fdb93bd1977dfd4f6eb040fd7b07952a96fd088fbedadc3d247b5a8cf\": rpc error: code = NotFound desc = could not find container \"6c2efc7fdb93bd1977dfd4f6eb040fd7b07952a96fd088fbedadc3d247b5a8cf\": container with ID starting with 6c2efc7fdb93bd1977dfd4f6eb040fd7b07952a96fd088fbedadc3d247b5a8cf not found: ID does not exist" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.857507 4687 scope.go:117] "RemoveContainer" containerID="3315597e6c52853b482ae4832ea6feecfc9df5367e543744629c2c2c01549524" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.857765 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3315597e6c52853b482ae4832ea6feecfc9df5367e543744629c2c2c01549524"} err="failed to get container status \"3315597e6c52853b482ae4832ea6feecfc9df5367e543744629c2c2c01549524\": rpc error: code = NotFound desc = could 
not find container \"3315597e6c52853b482ae4832ea6feecfc9df5367e543744629c2c2c01549524\": container with ID starting with 3315597e6c52853b482ae4832ea6feecfc9df5367e543744629c2c2c01549524 not found: ID does not exist" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.857786 4687 scope.go:117] "RemoveContainer" containerID="1a421cd585e8ec22da1e4d34b32242f47995669a55b9c04c7c202de0cc70c595" Feb 28 09:13:23 crc kubenswrapper[4687]: I0228 09:13:23.860369 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a421cd585e8ec22da1e4d34b32242f47995669a55b9c04c7c202de0cc70c595"} err="failed to get container status \"1a421cd585e8ec22da1e4d34b32242f47995669a55b9c04c7c202de0cc70c595\": rpc error: code = NotFound desc = could not find container \"1a421cd585e8ec22da1e4d34b32242f47995669a55b9c04c7c202de0cc70c595\": container with ID starting with 1a421cd585e8ec22da1e4d34b32242f47995669a55b9c04c7c202de0cc70c595 not found: ID does not exist" Feb 28 09:13:24 crc kubenswrapper[4687]: I0228 09:13:24.664011 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fb29f6b-2e87-454b-966f-5202547e1b6d" path="/var/lib/kubelet/pods/4fb29f6b-2e87-454b-966f-5202547e1b6d/volumes" Feb 28 09:13:24 crc kubenswrapper[4687]: I0228 09:13:24.694158 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" event={"ID":"462d112d-d672-4410-928b-240a62ba95a5","Type":"ContainerStarted","Data":"8f59e3d9b14d22a439fbc9d4bd63a0c32a78e219bf39272b3f98df3e7ee58a68"} Feb 28 09:13:24 crc kubenswrapper[4687]: I0228 09:13:24.694206 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" event={"ID":"462d112d-d672-4410-928b-240a62ba95a5","Type":"ContainerStarted","Data":"160bfc37a1a9c41b36e287eb67b0515ff869e6ee1523bfbd740ea9bf9753ff4b"} Feb 28 09:13:24 crc kubenswrapper[4687]: I0228 09:13:24.694219 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" event={"ID":"462d112d-d672-4410-928b-240a62ba95a5","Type":"ContainerStarted","Data":"45df33652866ccac22694f5a3040278ca2965f32cfd9b9b4438c210f36fd6931"} Feb 28 09:13:24 crc kubenswrapper[4687]: I0228 09:13:24.694231 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" event={"ID":"462d112d-d672-4410-928b-240a62ba95a5","Type":"ContainerStarted","Data":"7dbe63d9c348f94d8110b0fefafee9917fe1865f0ceae0c19e810f59b3a37613"} Feb 28 09:13:24 crc kubenswrapper[4687]: I0228 09:13:24.694241 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" event={"ID":"462d112d-d672-4410-928b-240a62ba95a5","Type":"ContainerStarted","Data":"537891bfbe6ebceef22593fc3ed279fb8921f9d3d4dde47b7225655df4ae38d3"} Feb 28 09:13:24 crc kubenswrapper[4687]: I0228 09:13:24.694249 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" event={"ID":"462d112d-d672-4410-928b-240a62ba95a5","Type":"ContainerStarted","Data":"3087fe815229eb616e573fff5fa4cd09831db649dc1f1b18641b2e4ec42d3fae"} Feb 28 09:13:25 crc kubenswrapper[4687]: I0228 09:13:25.002333 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:13:25 crc kubenswrapper[4687]: I0228 09:13:25.002394 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:13:26 crc kubenswrapper[4687]: I0228 09:13:26.710693 4687 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" event={"ID":"462d112d-d672-4410-928b-240a62ba95a5","Type":"ContainerStarted","Data":"058f4f5bb732652251543fdb6b67e8e72505f4ea258833a3e3efe0d436582b1f"} Feb 28 09:13:28 crc kubenswrapper[4687]: I0228 09:13:28.726867 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" event={"ID":"462d112d-d672-4410-928b-240a62ba95a5","Type":"ContainerStarted","Data":"00237f29d0decd6314e42ddbfdf91acdea8baa91aa7dd7feb1e552d1c4c16f73"} Feb 28 09:13:28 crc kubenswrapper[4687]: I0228 09:13:28.727404 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:28 crc kubenswrapper[4687]: I0228 09:13:28.727417 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:28 crc kubenswrapper[4687]: I0228 09:13:28.760253 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" podStartSLOduration=6.760239145 podStartE2EDuration="6.760239145s" podCreationTimestamp="2026-02-28 09:13:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:13:28.756621492 +0000 UTC m=+600.447190829" watchObservedRunningTime="2026-02-28 09:13:28.760239145 +0000 UTC m=+600.450808482" Feb 28 09:13:28 crc kubenswrapper[4687]: I0228 09:13:28.777483 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:29 crc kubenswrapper[4687]: I0228 09:13:29.735047 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:29 crc kubenswrapper[4687]: I0228 09:13:29.759460 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:48 crc kubenswrapper[4687]: I0228 09:13:48.656649 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" podUID="76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa" containerName="registry" containerID="cri-o://205ec602ca723acc2161f711d005827e53888abe184bc61b1728f6a7baa76c47" gracePeriod=30 Feb 28 09:13:48 crc kubenswrapper[4687]: I0228 09:13:48.835242 4687 generic.go:334] "Generic (PLEG): container finished" podID="76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa" containerID="205ec602ca723acc2161f711d005827e53888abe184bc61b1728f6a7baa76c47" exitCode=0 Feb 28 09:13:48 crc kubenswrapper[4687]: I0228 09:13:48.835339 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" event={"ID":"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa","Type":"ContainerDied","Data":"205ec602ca723acc2161f711d005827e53888abe184bc61b1728f6a7baa76c47"} Feb 28 09:13:48 crc kubenswrapper[4687]: I0228 09:13:48.989644 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.029918 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-installation-pull-secrets\") pod \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.030011 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6278\" (UniqueName: \"kubernetes.io/projected/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-kube-api-access-v6278\") pod \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.030042 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-bound-sa-token\") pod \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.030073 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-ca-trust-extracted\") pod \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.030117 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-trusted-ca\") pod \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.030156 4687 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-registry-tls\") pod \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.030324 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.030367 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-registry-certificates\") pod \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\" (UID: \"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa\") " Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.031463 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.032046 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.032319 4687 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.032343 4687 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.037114 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.037923 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.037995 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.038653 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-kube-api-access-v6278" (OuterVolumeSpecName: "kube-api-access-v6278") pod "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa"). InnerVolumeSpecName "kube-api-access-v6278". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.039454 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.044905 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa" (UID: "76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.133407 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6278\" (UniqueName: \"kubernetes.io/projected/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-kube-api-access-v6278\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.133430 4687 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.133439 4687 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.133449 4687 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.133458 4687 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.844052 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" event={"ID":"76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa","Type":"ContainerDied","Data":"4809719ca3f67beaecefed918c1d665433039841f891a1596f02691ec35ece92"} Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.844120 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-n9h5k" Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.844543 4687 scope.go:117] "RemoveContainer" containerID="205ec602ca723acc2161f711d005827e53888abe184bc61b1728f6a7baa76c47" Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.874191 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-n9h5k"] Feb 28 09:13:49 crc kubenswrapper[4687]: I0228 09:13:49.879337 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-n9h5k"] Feb 28 09:13:50 crc kubenswrapper[4687]: I0228 09:13:50.662600 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa" path="/var/lib/kubelet/pods/76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa/volumes" Feb 28 09:13:52 crc kubenswrapper[4687]: I0228 09:13:52.914543 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd"] Feb 28 09:13:52 crc kubenswrapper[4687]: E0228 09:13:52.914997 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa" containerName="registry" Feb 28 09:13:52 crc kubenswrapper[4687]: I0228 09:13:52.915010 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa" containerName="registry" Feb 28 09:13:52 crc kubenswrapper[4687]: I0228 09:13:52.915149 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="76d4ce2d-d8b3-4d38-91dd-79ee5a0365fa" containerName="registry" Feb 28 09:13:52 crc kubenswrapper[4687]: I0228 09:13:52.915809 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd" Feb 28 09:13:52 crc kubenswrapper[4687]: I0228 09:13:52.923448 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd"] Feb 28 09:13:52 crc kubenswrapper[4687]: I0228 09:13:52.930351 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 28 09:13:52 crc kubenswrapper[4687]: I0228 09:13:52.975867 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lwcx\" (UniqueName: \"kubernetes.io/projected/d5bd06a9-5b96-437f-a148-91f7d90e1f00-kube-api-access-5lwcx\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd\" (UID: \"d5bd06a9-5b96-437f-a148-91f7d90e1f00\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd" Feb 28 09:13:52 crc kubenswrapper[4687]: I0228 09:13:52.975933 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d5bd06a9-5b96-437f-a148-91f7d90e1f00-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd\" (UID: \"d5bd06a9-5b96-437f-a148-91f7d90e1f00\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd" Feb 28 09:13:52 crc kubenswrapper[4687]: I0228 09:13:52.975952 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d5bd06a9-5b96-437f-a148-91f7d90e1f00-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd\" (UID: \"d5bd06a9-5b96-437f-a148-91f7d90e1f00\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd" Feb 28 09:13:53 crc kubenswrapper[4687]: 
I0228 09:13:53.077077 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lwcx\" (UniqueName: \"kubernetes.io/projected/d5bd06a9-5b96-437f-a148-91f7d90e1f00-kube-api-access-5lwcx\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd\" (UID: \"d5bd06a9-5b96-437f-a148-91f7d90e1f00\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd" Feb 28 09:13:53 crc kubenswrapper[4687]: I0228 09:13:53.077170 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d5bd06a9-5b96-437f-a148-91f7d90e1f00-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd\" (UID: \"d5bd06a9-5b96-437f-a148-91f7d90e1f00\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd" Feb 28 09:13:53 crc kubenswrapper[4687]: I0228 09:13:53.077192 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d5bd06a9-5b96-437f-a148-91f7d90e1f00-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd\" (UID: \"d5bd06a9-5b96-437f-a148-91f7d90e1f00\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd" Feb 28 09:13:53 crc kubenswrapper[4687]: I0228 09:13:53.077640 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d5bd06a9-5b96-437f-a148-91f7d90e1f00-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd\" (UID: \"d5bd06a9-5b96-437f-a148-91f7d90e1f00\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd" Feb 28 09:13:53 crc kubenswrapper[4687]: I0228 09:13:53.077797 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/d5bd06a9-5b96-437f-a148-91f7d90e1f00-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd\" (UID: \"d5bd06a9-5b96-437f-a148-91f7d90e1f00\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd" Feb 28 09:13:53 crc kubenswrapper[4687]: I0228 09:13:53.095003 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lwcx\" (UniqueName: \"kubernetes.io/projected/d5bd06a9-5b96-437f-a148-91f7d90e1f00-kube-api-access-5lwcx\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd\" (UID: \"d5bd06a9-5b96-437f-a148-91f7d90e1f00\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd" Feb 28 09:13:53 crc kubenswrapper[4687]: I0228 09:13:53.151925 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-szdvj" Feb 28 09:13:53 crc kubenswrapper[4687]: I0228 09:13:53.230985 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd" Feb 28 09:13:53 crc kubenswrapper[4687]: I0228 09:13:53.604217 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd"] Feb 28 09:13:53 crc kubenswrapper[4687]: I0228 09:13:53.864328 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd" event={"ID":"d5bd06a9-5b96-437f-a148-91f7d90e1f00","Type":"ContainerStarted","Data":"1d609bc9b828cdae48fff88a69ff750f2dbc09e09ca08ce293cd33948fdcc11a"} Feb 28 09:13:53 crc kubenswrapper[4687]: I0228 09:13:53.864367 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd" event={"ID":"d5bd06a9-5b96-437f-a148-91f7d90e1f00","Type":"ContainerStarted","Data":"9d14b40017967f3c1985a012b071e56f34741bae2e6d614529ee4996edc39585"} Feb 28 09:13:54 crc kubenswrapper[4687]: I0228 09:13:54.871052 4687 generic.go:334] "Generic (PLEG): container finished" podID="d5bd06a9-5b96-437f-a148-91f7d90e1f00" containerID="1d609bc9b828cdae48fff88a69ff750f2dbc09e09ca08ce293cd33948fdcc11a" exitCode=0 Feb 28 09:13:54 crc kubenswrapper[4687]: I0228 09:13:54.871259 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd" event={"ID":"d5bd06a9-5b96-437f-a148-91f7d90e1f00","Type":"ContainerDied","Data":"1d609bc9b828cdae48fff88a69ff750f2dbc09e09ca08ce293cd33948fdcc11a"} Feb 28 09:13:55 crc kubenswrapper[4687]: I0228 09:13:55.002471 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Feb 28 09:13:55 crc kubenswrapper[4687]: I0228 09:13:55.002544 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:13:55 crc kubenswrapper[4687]: I0228 09:13:55.002597 4687 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" Feb 28 09:13:55 crc kubenswrapper[4687]: I0228 09:13:55.003142 4687 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bcbde49ebdbfb08d03f55668dbe45e77e9c15c2d2f6e5cdcc206fabca01051bf"} pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 28 09:13:55 crc kubenswrapper[4687]: I0228 09:13:55.003212 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" containerID="cri-o://bcbde49ebdbfb08d03f55668dbe45e77e9c15c2d2f6e5cdcc206fabca01051bf" gracePeriod=600 Feb 28 09:13:55 crc kubenswrapper[4687]: I0228 09:13:55.880985 4687 generic.go:334] "Generic (PLEG): container finished" podID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerID="bcbde49ebdbfb08d03f55668dbe45e77e9c15c2d2f6e5cdcc206fabca01051bf" exitCode=0 Feb 28 09:13:55 crc kubenswrapper[4687]: I0228 09:13:55.881051 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" 
event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerDied","Data":"bcbde49ebdbfb08d03f55668dbe45e77e9c15c2d2f6e5cdcc206fabca01051bf"} Feb 28 09:13:55 crc kubenswrapper[4687]: I0228 09:13:55.881711 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerStarted","Data":"e2099836a5e3e90d046dbb8521988fee6933b3b356479c2ff7510ccbe5caaedf"} Feb 28 09:13:55 crc kubenswrapper[4687]: I0228 09:13:55.881735 4687 scope.go:117] "RemoveContainer" containerID="fad4ec2f45b132fa1fcbba9b5a4c5891531193748a3177bf121c290113487ba4" Feb 28 09:13:56 crc kubenswrapper[4687]: I0228 09:13:56.891353 4687 generic.go:334] "Generic (PLEG): container finished" podID="d5bd06a9-5b96-437f-a148-91f7d90e1f00" containerID="dc98ec24ae08ec419052491f50ca8338b354dccd692bed01d2ffc0de7e075484" exitCode=0 Feb 28 09:13:56 crc kubenswrapper[4687]: I0228 09:13:56.891462 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd" event={"ID":"d5bd06a9-5b96-437f-a148-91f7d90e1f00","Type":"ContainerDied","Data":"dc98ec24ae08ec419052491f50ca8338b354dccd692bed01d2ffc0de7e075484"} Feb 28 09:13:57 crc kubenswrapper[4687]: I0228 09:13:57.906102 4687 generic.go:334] "Generic (PLEG): container finished" podID="d5bd06a9-5b96-437f-a148-91f7d90e1f00" containerID="358bd758d32b4c2d148f0c7b52ba6f1f6adc06f510458323bdbc36f997b31270" exitCode=0 Feb 28 09:13:57 crc kubenswrapper[4687]: I0228 09:13:57.906223 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd" event={"ID":"d5bd06a9-5b96-437f-a148-91f7d90e1f00","Type":"ContainerDied","Data":"358bd758d32b4c2d148f0c7b52ba6f1f6adc06f510458323bdbc36f997b31270"} Feb 28 09:13:59 crc kubenswrapper[4687]: I0228 09:13:59.109580 4687 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd" Feb 28 09:13:59 crc kubenswrapper[4687]: I0228 09:13:59.150395 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lwcx\" (UniqueName: \"kubernetes.io/projected/d5bd06a9-5b96-437f-a148-91f7d90e1f00-kube-api-access-5lwcx\") pod \"d5bd06a9-5b96-437f-a148-91f7d90e1f00\" (UID: \"d5bd06a9-5b96-437f-a148-91f7d90e1f00\") " Feb 28 09:13:59 crc kubenswrapper[4687]: I0228 09:13:59.150494 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d5bd06a9-5b96-437f-a148-91f7d90e1f00-bundle\") pod \"d5bd06a9-5b96-437f-a148-91f7d90e1f00\" (UID: \"d5bd06a9-5b96-437f-a148-91f7d90e1f00\") " Feb 28 09:13:59 crc kubenswrapper[4687]: I0228 09:13:59.150521 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d5bd06a9-5b96-437f-a148-91f7d90e1f00-util\") pod \"d5bd06a9-5b96-437f-a148-91f7d90e1f00\" (UID: \"d5bd06a9-5b96-437f-a148-91f7d90e1f00\") " Feb 28 09:13:59 crc kubenswrapper[4687]: I0228 09:13:59.150980 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5bd06a9-5b96-437f-a148-91f7d90e1f00-bundle" (OuterVolumeSpecName: "bundle") pod "d5bd06a9-5b96-437f-a148-91f7d90e1f00" (UID: "d5bd06a9-5b96-437f-a148-91f7d90e1f00"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:13:59 crc kubenswrapper[4687]: I0228 09:13:59.155652 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5bd06a9-5b96-437f-a148-91f7d90e1f00-kube-api-access-5lwcx" (OuterVolumeSpecName: "kube-api-access-5lwcx") pod "d5bd06a9-5b96-437f-a148-91f7d90e1f00" (UID: "d5bd06a9-5b96-437f-a148-91f7d90e1f00"). InnerVolumeSpecName "kube-api-access-5lwcx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:13:59 crc kubenswrapper[4687]: I0228 09:13:59.159457 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5bd06a9-5b96-437f-a148-91f7d90e1f00-util" (OuterVolumeSpecName: "util") pod "d5bd06a9-5b96-437f-a148-91f7d90e1f00" (UID: "d5bd06a9-5b96-437f-a148-91f7d90e1f00"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:13:59 crc kubenswrapper[4687]: I0228 09:13:59.252268 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lwcx\" (UniqueName: \"kubernetes.io/projected/d5bd06a9-5b96-437f-a148-91f7d90e1f00-kube-api-access-5lwcx\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:59 crc kubenswrapper[4687]: I0228 09:13:59.252305 4687 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d5bd06a9-5b96-437f-a148-91f7d90e1f00-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:59 crc kubenswrapper[4687]: I0228 09:13:59.252316 4687 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d5bd06a9-5b96-437f-a148-91f7d90e1f00-util\") on node \"crc\" DevicePath \"\"" Feb 28 09:13:59 crc kubenswrapper[4687]: I0228 09:13:59.923743 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd" event={"ID":"d5bd06a9-5b96-437f-a148-91f7d90e1f00","Type":"ContainerDied","Data":"9d14b40017967f3c1985a012b071e56f34741bae2e6d614529ee4996edc39585"} Feb 28 09:13:59 crc kubenswrapper[4687]: I0228 09:13:59.923811 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d14b40017967f3c1985a012b071e56f34741bae2e6d614529ee4996edc39585" Feb 28 09:13:59 crc kubenswrapper[4687]: I0228 09:13:59.924163 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd" Feb 28 09:14:00 crc kubenswrapper[4687]: I0228 09:14:00.131585 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537834-rb4jt"] Feb 28 09:14:00 crc kubenswrapper[4687]: E0228 09:14:00.132489 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5bd06a9-5b96-437f-a148-91f7d90e1f00" containerName="util" Feb 28 09:14:00 crc kubenswrapper[4687]: I0228 09:14:00.132513 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5bd06a9-5b96-437f-a148-91f7d90e1f00" containerName="util" Feb 28 09:14:00 crc kubenswrapper[4687]: E0228 09:14:00.132532 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5bd06a9-5b96-437f-a148-91f7d90e1f00" containerName="pull" Feb 28 09:14:00 crc kubenswrapper[4687]: I0228 09:14:00.132539 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5bd06a9-5b96-437f-a148-91f7d90e1f00" containerName="pull" Feb 28 09:14:00 crc kubenswrapper[4687]: E0228 09:14:00.132551 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5bd06a9-5b96-437f-a148-91f7d90e1f00" containerName="extract" Feb 28 09:14:00 crc kubenswrapper[4687]: I0228 09:14:00.132559 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5bd06a9-5b96-437f-a148-91f7d90e1f00" containerName="extract" Feb 28 09:14:00 crc kubenswrapper[4687]: I0228 09:14:00.132702 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5bd06a9-5b96-437f-a148-91f7d90e1f00" containerName="extract" Feb 28 09:14:00 crc kubenswrapper[4687]: I0228 09:14:00.133401 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537834-rb4jt" Feb 28 09:14:00 crc kubenswrapper[4687]: I0228 09:14:00.135811 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:14:00 crc kubenswrapper[4687]: I0228 09:14:00.135986 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:14:00 crc kubenswrapper[4687]: I0228 09:14:00.136289 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:14:00 crc kubenswrapper[4687]: I0228 09:14:00.139085 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537834-rb4jt"] Feb 28 09:14:00 crc kubenswrapper[4687]: I0228 09:14:00.162316 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcng6\" (UniqueName: \"kubernetes.io/projected/290aa17f-f371-42cf-875b-7166fc432dd2-kube-api-access-bcng6\") pod \"auto-csr-approver-29537834-rb4jt\" (UID: \"290aa17f-f371-42cf-875b-7166fc432dd2\") " pod="openshift-infra/auto-csr-approver-29537834-rb4jt" Feb 28 09:14:00 crc kubenswrapper[4687]: I0228 09:14:00.263641 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcng6\" (UniqueName: \"kubernetes.io/projected/290aa17f-f371-42cf-875b-7166fc432dd2-kube-api-access-bcng6\") pod \"auto-csr-approver-29537834-rb4jt\" (UID: \"290aa17f-f371-42cf-875b-7166fc432dd2\") " pod="openshift-infra/auto-csr-approver-29537834-rb4jt" Feb 28 09:14:00 crc kubenswrapper[4687]: I0228 09:14:00.279468 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcng6\" (UniqueName: \"kubernetes.io/projected/290aa17f-f371-42cf-875b-7166fc432dd2-kube-api-access-bcng6\") pod \"auto-csr-approver-29537834-rb4jt\" (UID: \"290aa17f-f371-42cf-875b-7166fc432dd2\") " 
pod="openshift-infra/auto-csr-approver-29537834-rb4jt" Feb 28 09:14:00 crc kubenswrapper[4687]: I0228 09:14:00.449696 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537834-rb4jt" Feb 28 09:14:00 crc kubenswrapper[4687]: I0228 09:14:00.591611 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537834-rb4jt"] Feb 28 09:14:00 crc kubenswrapper[4687]: W0228 09:14:00.597648 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod290aa17f_f371_42cf_875b_7166fc432dd2.slice/crio-bb2550b8dbeeb37f994bbc19db6ead406a220d29228f8be5b72c1a851153cfb3 WatchSource:0}: Error finding container bb2550b8dbeeb37f994bbc19db6ead406a220d29228f8be5b72c1a851153cfb3: Status 404 returned error can't find the container with id bb2550b8dbeeb37f994bbc19db6ead406a220d29228f8be5b72c1a851153cfb3 Feb 28 09:14:00 crc kubenswrapper[4687]: I0228 09:14:00.935137 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537834-rb4jt" event={"ID":"290aa17f-f371-42cf-875b-7166fc432dd2","Type":"ContainerStarted","Data":"bb2550b8dbeeb37f994bbc19db6ead406a220d29228f8be5b72c1a851153cfb3"} Feb 28 09:14:01 crc kubenswrapper[4687]: I0228 09:14:01.544779 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-5sb42"] Feb 28 09:14:01 crc kubenswrapper[4687]: I0228 09:14:01.545444 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-5sb42" Feb 28 09:14:01 crc kubenswrapper[4687]: I0228 09:14:01.549382 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 28 09:14:01 crc kubenswrapper[4687]: I0228 09:14:01.549550 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-c6n8n" Feb 28 09:14:01 crc kubenswrapper[4687]: I0228 09:14:01.549979 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 28 09:14:01 crc kubenswrapper[4687]: I0228 09:14:01.554688 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-5sb42"] Feb 28 09:14:01 crc kubenswrapper[4687]: I0228 09:14:01.578639 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shzkk\" (UniqueName: \"kubernetes.io/projected/8da54ae4-877e-4e38-890c-8eabef7c7033-kube-api-access-shzkk\") pod \"nmstate-operator-75c5dccd6c-5sb42\" (UID: \"8da54ae4-877e-4e38-890c-8eabef7c7033\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-5sb42" Feb 28 09:14:01 crc kubenswrapper[4687]: I0228 09:14:01.679686 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shzkk\" (UniqueName: \"kubernetes.io/projected/8da54ae4-877e-4e38-890c-8eabef7c7033-kube-api-access-shzkk\") pod \"nmstate-operator-75c5dccd6c-5sb42\" (UID: \"8da54ae4-877e-4e38-890c-8eabef7c7033\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-5sb42" Feb 28 09:14:01 crc kubenswrapper[4687]: I0228 09:14:01.698185 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shzkk\" (UniqueName: \"kubernetes.io/projected/8da54ae4-877e-4e38-890c-8eabef7c7033-kube-api-access-shzkk\") pod \"nmstate-operator-75c5dccd6c-5sb42\" (UID: 
\"8da54ae4-877e-4e38-890c-8eabef7c7033\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-5sb42" Feb 28 09:14:01 crc kubenswrapper[4687]: I0228 09:14:01.858034 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-5sb42" Feb 28 09:14:01 crc kubenswrapper[4687]: I0228 09:14:01.949310 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537834-rb4jt" event={"ID":"290aa17f-f371-42cf-875b-7166fc432dd2","Type":"ContainerStarted","Data":"278b6ec447930ef86d38816dae5a24aeeb143cda885ae03ff21e8c66d663fc24"} Feb 28 09:14:01 crc kubenswrapper[4687]: I0228 09:14:01.965109 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29537834-rb4jt" podStartSLOduration=0.990849352 podStartE2EDuration="1.965083152s" podCreationTimestamp="2026-02-28 09:14:00 +0000 UTC" firstStartedPulling="2026-02-28 09:14:00.600527432 +0000 UTC m=+632.291096770" lastFinishedPulling="2026-02-28 09:14:01.574761233 +0000 UTC m=+633.265330570" observedRunningTime="2026-02-28 09:14:01.959267888 +0000 UTC m=+633.649837225" watchObservedRunningTime="2026-02-28 09:14:01.965083152 +0000 UTC m=+633.655652490" Feb 28 09:14:02 crc kubenswrapper[4687]: I0228 09:14:02.272900 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-5sb42"] Feb 28 09:14:02 crc kubenswrapper[4687]: W0228 09:14:02.279691 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8da54ae4_877e_4e38_890c_8eabef7c7033.slice/crio-63ef71df312a09407dd3945db5a6494c861a3003d6c327a7e0e34a469c7ba2b8 WatchSource:0}: Error finding container 63ef71df312a09407dd3945db5a6494c861a3003d6c327a7e0e34a469c7ba2b8: Status 404 returned error can't find the container with id 63ef71df312a09407dd3945db5a6494c861a3003d6c327a7e0e34a469c7ba2b8 Feb 28 09:14:02 crc 
kubenswrapper[4687]: I0228 09:14:02.955450 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-5sb42" event={"ID":"8da54ae4-877e-4e38-890c-8eabef7c7033","Type":"ContainerStarted","Data":"63ef71df312a09407dd3945db5a6494c861a3003d6c327a7e0e34a469c7ba2b8"} Feb 28 09:14:02 crc kubenswrapper[4687]: I0228 09:14:02.957307 4687 generic.go:334] "Generic (PLEG): container finished" podID="290aa17f-f371-42cf-875b-7166fc432dd2" containerID="278b6ec447930ef86d38816dae5a24aeeb143cda885ae03ff21e8c66d663fc24" exitCode=0 Feb 28 09:14:02 crc kubenswrapper[4687]: I0228 09:14:02.957341 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537834-rb4jt" event={"ID":"290aa17f-f371-42cf-875b-7166fc432dd2","Type":"ContainerDied","Data":"278b6ec447930ef86d38816dae5a24aeeb143cda885ae03ff21e8c66d663fc24"} Feb 28 09:14:04 crc kubenswrapper[4687]: I0228 09:14:04.383666 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537834-rb4jt" Feb 28 09:14:04 crc kubenswrapper[4687]: I0228 09:14:04.414728 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcng6\" (UniqueName: \"kubernetes.io/projected/290aa17f-f371-42cf-875b-7166fc432dd2-kube-api-access-bcng6\") pod \"290aa17f-f371-42cf-875b-7166fc432dd2\" (UID: \"290aa17f-f371-42cf-875b-7166fc432dd2\") " Feb 28 09:14:04 crc kubenswrapper[4687]: I0228 09:14:04.421486 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/290aa17f-f371-42cf-875b-7166fc432dd2-kube-api-access-bcng6" (OuterVolumeSpecName: "kube-api-access-bcng6") pod "290aa17f-f371-42cf-875b-7166fc432dd2" (UID: "290aa17f-f371-42cf-875b-7166fc432dd2"). InnerVolumeSpecName "kube-api-access-bcng6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:14:04 crc kubenswrapper[4687]: I0228 09:14:04.517737 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcng6\" (UniqueName: \"kubernetes.io/projected/290aa17f-f371-42cf-875b-7166fc432dd2-kube-api-access-bcng6\") on node \"crc\" DevicePath \"\"" Feb 28 09:14:04 crc kubenswrapper[4687]: E0228 09:14:04.738224 4687 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod290aa17f_f371_42cf_875b_7166fc432dd2.slice\": RecentStats: unable to find data in memory cache]" Feb 28 09:14:04 crc kubenswrapper[4687]: I0228 09:14:04.970895 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-5sb42" event={"ID":"8da54ae4-877e-4e38-890c-8eabef7c7033","Type":"ContainerStarted","Data":"b36e96777733d0d8353fbdc12a41796e8c5a0604ad9aa0318ac34513640f0b3e"} Feb 28 09:14:04 crc kubenswrapper[4687]: I0228 09:14:04.972640 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537834-rb4jt" event={"ID":"290aa17f-f371-42cf-875b-7166fc432dd2","Type":"ContainerDied","Data":"bb2550b8dbeeb37f994bbc19db6ead406a220d29228f8be5b72c1a851153cfb3"} Feb 28 09:14:04 crc kubenswrapper[4687]: I0228 09:14:04.972686 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb2550b8dbeeb37f994bbc19db6ead406a220d29228f8be5b72c1a851153cfb3" Feb 28 09:14:04 crc kubenswrapper[4687]: I0228 09:14:04.972688 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537834-rb4jt" Feb 28 09:14:04 crc kubenswrapper[4687]: I0228 09:14:04.986650 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-5sb42" podStartSLOduration=1.894224732 podStartE2EDuration="3.986626912s" podCreationTimestamp="2026-02-28 09:14:01 +0000 UTC" firstStartedPulling="2026-02-28 09:14:02.282396258 +0000 UTC m=+633.972965595" lastFinishedPulling="2026-02-28 09:14:04.374798439 +0000 UTC m=+636.065367775" observedRunningTime="2026-02-28 09:14:04.98570596 +0000 UTC m=+636.676275297" watchObservedRunningTime="2026-02-28 09:14:04.986626912 +0000 UTC m=+636.677196249" Feb 28 09:14:05 crc kubenswrapper[4687]: I0228 09:14:05.012917 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537828-kf7k8"] Feb 28 09:14:05 crc kubenswrapper[4687]: I0228 09:14:05.016969 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537828-kf7k8"] Feb 28 09:14:06 crc kubenswrapper[4687]: I0228 09:14:06.661763 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f99cb978-113d-4662-b69e-04425f442f83" path="/var/lib/kubelet/pods/f99cb978-113d-4662-b69e-04425f442f83/volumes" Feb 28 09:14:37 crc kubenswrapper[4687]: I0228 09:14:37.406131 4687 scope.go:117] "RemoveContainer" containerID="65b8e29dcdb142ad307015f50a6b7f202bd2872d87fa74ea449f42cb0938bde9" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.363714 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-8kg5p"] Feb 28 09:14:47 crc kubenswrapper[4687]: E0228 09:14:47.365356 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290aa17f-f371-42cf-875b-7166fc432dd2" containerName="oc" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.365431 4687 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="290aa17f-f371-42cf-875b-7166fc432dd2" containerName="oc" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.365610 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="290aa17f-f371-42cf-875b-7166fc432dd2" containerName="oc" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.366107 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-8kg5p" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.367918 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.368098 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-c66wz"] Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.368247 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-2f7x4" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.368987 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-c66wz" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.382702 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-6lptp"] Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.383354 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-6lptp" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.394347 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-c66wz"] Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.403176 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/02027bc3-0840-49ed-afe6-13d5285bdff9-dbus-socket\") pod \"nmstate-handler-6lptp\" (UID: \"02027bc3-0840-49ed-afe6-13d5285bdff9\") " pod="openshift-nmstate/nmstate-handler-6lptp" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.403319 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/02027bc3-0840-49ed-afe6-13d5285bdff9-ovs-socket\") pod \"nmstate-handler-6lptp\" (UID: \"02027bc3-0840-49ed-afe6-13d5285bdff9\") " pod="openshift-nmstate/nmstate-handler-6lptp" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.403399 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5dc19058-cbce-4742-9c1f-11005a9aefbf-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-8kg5p\" (UID: \"5dc19058-cbce-4742-9c1f-11005a9aefbf\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-8kg5p" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.403476 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/02027bc3-0840-49ed-afe6-13d5285bdff9-nmstate-lock\") pod \"nmstate-handler-6lptp\" (UID: \"02027bc3-0840-49ed-afe6-13d5285bdff9\") " pod="openshift-nmstate/nmstate-handler-6lptp" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.403605 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-wwz7h\" (UniqueName: \"kubernetes.io/projected/88c47658-dd20-4f97-b063-b95f5bd2d79d-kube-api-access-wwz7h\") pod \"nmstate-metrics-69594cc75-c66wz\" (UID: \"88c47658-dd20-4f97-b063-b95f5bd2d79d\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-c66wz" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.403695 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfzlc\" (UniqueName: \"kubernetes.io/projected/5dc19058-cbce-4742-9c1f-11005a9aefbf-kube-api-access-cfzlc\") pod \"nmstate-webhook-786f45cff4-8kg5p\" (UID: \"5dc19058-cbce-4742-9c1f-11005a9aefbf\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-8kg5p" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.403874 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9267\" (UniqueName: \"kubernetes.io/projected/02027bc3-0840-49ed-afe6-13d5285bdff9-kube-api-access-m9267\") pod \"nmstate-handler-6lptp\" (UID: \"02027bc3-0840-49ed-afe6-13d5285bdff9\") " pod="openshift-nmstate/nmstate-handler-6lptp" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.408635 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-8kg5p"] Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.446998 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-pq26r"] Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.447553 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-pq26r" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.449357 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.449548 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.449740 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-2zpc9" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.466427 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-pq26r"] Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.504379 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/02027bc3-0840-49ed-afe6-13d5285bdff9-ovs-socket\") pod \"nmstate-handler-6lptp\" (UID: \"02027bc3-0840-49ed-afe6-13d5285bdff9\") " pod="openshift-nmstate/nmstate-handler-6lptp" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.504411 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5dc19058-cbce-4742-9c1f-11005a9aefbf-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-8kg5p\" (UID: \"5dc19058-cbce-4742-9c1f-11005a9aefbf\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-8kg5p" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.504433 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/02027bc3-0840-49ed-afe6-13d5285bdff9-nmstate-lock\") pod \"nmstate-handler-6lptp\" (UID: \"02027bc3-0840-49ed-afe6-13d5285bdff9\") " pod="openshift-nmstate/nmstate-handler-6lptp" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 
09:14:47.504470 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwz7h\" (UniqueName: \"kubernetes.io/projected/88c47658-dd20-4f97-b063-b95f5bd2d79d-kube-api-access-wwz7h\") pod \"nmstate-metrics-69594cc75-c66wz\" (UID: \"88c47658-dd20-4f97-b063-b95f5bd2d79d\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-c66wz" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.504483 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/02027bc3-0840-49ed-afe6-13d5285bdff9-ovs-socket\") pod \"nmstate-handler-6lptp\" (UID: \"02027bc3-0840-49ed-afe6-13d5285bdff9\") " pod="openshift-nmstate/nmstate-handler-6lptp" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.504493 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfzlc\" (UniqueName: \"kubernetes.io/projected/5dc19058-cbce-4742-9c1f-11005a9aefbf-kube-api-access-cfzlc\") pod \"nmstate-webhook-786f45cff4-8kg5p\" (UID: \"5dc19058-cbce-4742-9c1f-11005a9aefbf\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-8kg5p" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.504528 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe724a47-e6db-4940-885f-318abb45fb46-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-pq26r\" (UID: \"fe724a47-e6db-4940-885f-318abb45fb46\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-pq26r" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.504560 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldg2b\" (UniqueName: \"kubernetes.io/projected/fe724a47-e6db-4940-885f-318abb45fb46-kube-api-access-ldg2b\") pod \"nmstate-console-plugin-5dcbbd79cf-pq26r\" (UID: \"fe724a47-e6db-4940-885f-318abb45fb46\") " 
pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-pq26r" Feb 28 09:14:47 crc kubenswrapper[4687]: E0228 09:14:47.504576 4687 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.504591 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9267\" (UniqueName: \"kubernetes.io/projected/02027bc3-0840-49ed-afe6-13d5285bdff9-kube-api-access-m9267\") pod \"nmstate-handler-6lptp\" (UID: \"02027bc3-0840-49ed-afe6-13d5285bdff9\") " pod="openshift-nmstate/nmstate-handler-6lptp" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.504615 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fe724a47-e6db-4940-885f-318abb45fb46-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-pq26r\" (UID: \"fe724a47-e6db-4940-885f-318abb45fb46\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-pq26r" Feb 28 09:14:47 crc kubenswrapper[4687]: E0228 09:14:47.504642 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dc19058-cbce-4742-9c1f-11005a9aefbf-tls-key-pair podName:5dc19058-cbce-4742-9c1f-11005a9aefbf nodeName:}" failed. No retries permitted until 2026-02-28 09:14:48.004623066 +0000 UTC m=+679.695192402 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/5dc19058-cbce-4742-9c1f-11005a9aefbf-tls-key-pair") pod "nmstate-webhook-786f45cff4-8kg5p" (UID: "5dc19058-cbce-4742-9c1f-11005a9aefbf") : secret "openshift-nmstate-webhook" not found Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.504654 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/02027bc3-0840-49ed-afe6-13d5285bdff9-dbus-socket\") pod \"nmstate-handler-6lptp\" (UID: \"02027bc3-0840-49ed-afe6-13d5285bdff9\") " pod="openshift-nmstate/nmstate-handler-6lptp" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.504734 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/02027bc3-0840-49ed-afe6-13d5285bdff9-nmstate-lock\") pod \"nmstate-handler-6lptp\" (UID: \"02027bc3-0840-49ed-afe6-13d5285bdff9\") " pod="openshift-nmstate/nmstate-handler-6lptp" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.504887 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/02027bc3-0840-49ed-afe6-13d5285bdff9-dbus-socket\") pod \"nmstate-handler-6lptp\" (UID: \"02027bc3-0840-49ed-afe6-13d5285bdff9\") " pod="openshift-nmstate/nmstate-handler-6lptp" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.520112 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9267\" (UniqueName: \"kubernetes.io/projected/02027bc3-0840-49ed-afe6-13d5285bdff9-kube-api-access-m9267\") pod \"nmstate-handler-6lptp\" (UID: \"02027bc3-0840-49ed-afe6-13d5285bdff9\") " pod="openshift-nmstate/nmstate-handler-6lptp" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.520151 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwz7h\" (UniqueName: 
\"kubernetes.io/projected/88c47658-dd20-4f97-b063-b95f5bd2d79d-kube-api-access-wwz7h\") pod \"nmstate-metrics-69594cc75-c66wz\" (UID: \"88c47658-dd20-4f97-b063-b95f5bd2d79d\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-c66wz" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.520624 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfzlc\" (UniqueName: \"kubernetes.io/projected/5dc19058-cbce-4742-9c1f-11005a9aefbf-kube-api-access-cfzlc\") pod \"nmstate-webhook-786f45cff4-8kg5p\" (UID: \"5dc19058-cbce-4742-9c1f-11005a9aefbf\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-8kg5p" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.600583 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7cfc4f8755-dkqc8"] Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.601276 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.605486 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b565164f-b73d-43a9-a4a0-4501a215f7d7-oauth-serving-cert\") pod \"console-7cfc4f8755-dkqc8\" (UID: \"b565164f-b73d-43a9-a4a0-4501a215f7d7\") " pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.605525 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b565164f-b73d-43a9-a4a0-4501a215f7d7-console-config\") pod \"console-7cfc4f8755-dkqc8\" (UID: \"b565164f-b73d-43a9-a4a0-4501a215f7d7\") " pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.605544 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xqnhq\" (UniqueName: \"kubernetes.io/projected/b565164f-b73d-43a9-a4a0-4501a215f7d7-kube-api-access-xqnhq\") pod \"console-7cfc4f8755-dkqc8\" (UID: \"b565164f-b73d-43a9-a4a0-4501a215f7d7\") " pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.605559 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b565164f-b73d-43a9-a4a0-4501a215f7d7-console-oauth-config\") pod \"console-7cfc4f8755-dkqc8\" (UID: \"b565164f-b73d-43a9-a4a0-4501a215f7d7\") " pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.605698 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe724a47-e6db-4940-885f-318abb45fb46-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-pq26r\" (UID: \"fe724a47-e6db-4940-885f-318abb45fb46\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-pq26r" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.605746 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b565164f-b73d-43a9-a4a0-4501a215f7d7-service-ca\") pod \"console-7cfc4f8755-dkqc8\" (UID: \"b565164f-b73d-43a9-a4a0-4501a215f7d7\") " pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.605847 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldg2b\" (UniqueName: \"kubernetes.io/projected/fe724a47-e6db-4940-885f-318abb45fb46-kube-api-access-ldg2b\") pod \"nmstate-console-plugin-5dcbbd79cf-pq26r\" (UID: \"fe724a47-e6db-4940-885f-318abb45fb46\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-pq26r" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.605887 
4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b565164f-b73d-43a9-a4a0-4501a215f7d7-trusted-ca-bundle\") pod \"console-7cfc4f8755-dkqc8\" (UID: \"b565164f-b73d-43a9-a4a0-4501a215f7d7\") " pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.605914 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b565164f-b73d-43a9-a4a0-4501a215f7d7-console-serving-cert\") pod \"console-7cfc4f8755-dkqc8\" (UID: \"b565164f-b73d-43a9-a4a0-4501a215f7d7\") " pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.605935 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fe724a47-e6db-4940-885f-318abb45fb46-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-pq26r\" (UID: \"fe724a47-e6db-4940-885f-318abb45fb46\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-pq26r" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.606702 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fe724a47-e6db-4940-885f-318abb45fb46-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-pq26r\" (UID: \"fe724a47-e6db-4940-885f-318abb45fb46\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-pq26r" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.609596 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe724a47-e6db-4940-885f-318abb45fb46-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-pq26r\" (UID: \"fe724a47-e6db-4940-885f-318abb45fb46\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-pq26r" 
Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.609621 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cfc4f8755-dkqc8"] Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.629801 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldg2b\" (UniqueName: \"kubernetes.io/projected/fe724a47-e6db-4940-885f-318abb45fb46-kube-api-access-ldg2b\") pod \"nmstate-console-plugin-5dcbbd79cf-pq26r\" (UID: \"fe724a47-e6db-4940-885f-318abb45fb46\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-pq26r" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.693047 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-c66wz" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.698048 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-6lptp" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.706769 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b565164f-b73d-43a9-a4a0-4501a215f7d7-oauth-serving-cert\") pod \"console-7cfc4f8755-dkqc8\" (UID: \"b565164f-b73d-43a9-a4a0-4501a215f7d7\") " pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.706816 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b565164f-b73d-43a9-a4a0-4501a215f7d7-console-config\") pod \"console-7cfc4f8755-dkqc8\" (UID: \"b565164f-b73d-43a9-a4a0-4501a215f7d7\") " pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.706847 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqnhq\" (UniqueName: 
\"kubernetes.io/projected/b565164f-b73d-43a9-a4a0-4501a215f7d7-kube-api-access-xqnhq\") pod \"console-7cfc4f8755-dkqc8\" (UID: \"b565164f-b73d-43a9-a4a0-4501a215f7d7\") " pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.706864 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b565164f-b73d-43a9-a4a0-4501a215f7d7-console-oauth-config\") pod \"console-7cfc4f8755-dkqc8\" (UID: \"b565164f-b73d-43a9-a4a0-4501a215f7d7\") " pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.707094 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b565164f-b73d-43a9-a4a0-4501a215f7d7-service-ca\") pod \"console-7cfc4f8755-dkqc8\" (UID: \"b565164f-b73d-43a9-a4a0-4501a215f7d7\") " pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.707150 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b565164f-b73d-43a9-a4a0-4501a215f7d7-trusted-ca-bundle\") pod \"console-7cfc4f8755-dkqc8\" (UID: \"b565164f-b73d-43a9-a4a0-4501a215f7d7\") " pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.707178 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b565164f-b73d-43a9-a4a0-4501a215f7d7-console-serving-cert\") pod \"console-7cfc4f8755-dkqc8\" (UID: \"b565164f-b73d-43a9-a4a0-4501a215f7d7\") " pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.707661 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/b565164f-b73d-43a9-a4a0-4501a215f7d7-console-config\") pod \"console-7cfc4f8755-dkqc8\" (UID: \"b565164f-b73d-43a9-a4a0-4501a215f7d7\") " pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.707728 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b565164f-b73d-43a9-a4a0-4501a215f7d7-oauth-serving-cert\") pod \"console-7cfc4f8755-dkqc8\" (UID: \"b565164f-b73d-43a9-a4a0-4501a215f7d7\") " pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.707926 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b565164f-b73d-43a9-a4a0-4501a215f7d7-service-ca\") pod \"console-7cfc4f8755-dkqc8\" (UID: \"b565164f-b73d-43a9-a4a0-4501a215f7d7\") " pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.708154 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b565164f-b73d-43a9-a4a0-4501a215f7d7-trusted-ca-bundle\") pod \"console-7cfc4f8755-dkqc8\" (UID: \"b565164f-b73d-43a9-a4a0-4501a215f7d7\") " pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.710253 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b565164f-b73d-43a9-a4a0-4501a215f7d7-console-oauth-config\") pod \"console-7cfc4f8755-dkqc8\" (UID: \"b565164f-b73d-43a9-a4a0-4501a215f7d7\") " pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.710258 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b565164f-b73d-43a9-a4a0-4501a215f7d7-console-serving-cert\") pod \"console-7cfc4f8755-dkqc8\" (UID: \"b565164f-b73d-43a9-a4a0-4501a215f7d7\") " pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.722698 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqnhq\" (UniqueName: \"kubernetes.io/projected/b565164f-b73d-43a9-a4a0-4501a215f7d7-kube-api-access-xqnhq\") pod \"console-7cfc4f8755-dkqc8\" (UID: \"b565164f-b73d-43a9-a4a0-4501a215f7d7\") " pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.773600 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-pq26r" Feb 28 09:14:47 crc kubenswrapper[4687]: I0228 09:14:47.926507 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:48 crc kubenswrapper[4687]: I0228 09:14:48.010547 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5dc19058-cbce-4742-9c1f-11005a9aefbf-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-8kg5p\" (UID: \"5dc19058-cbce-4742-9c1f-11005a9aefbf\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-8kg5p" Feb 28 09:14:48 crc kubenswrapper[4687]: I0228 09:14:48.013564 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5dc19058-cbce-4742-9c1f-11005a9aefbf-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-8kg5p\" (UID: \"5dc19058-cbce-4742-9c1f-11005a9aefbf\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-8kg5p" Feb 28 09:14:48 crc kubenswrapper[4687]: I0228 09:14:48.033699 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-c66wz"] Feb 28 09:14:48 crc 
kubenswrapper[4687]: I0228 09:14:48.078526 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cfc4f8755-dkqc8"] Feb 28 09:14:48 crc kubenswrapper[4687]: W0228 09:14:48.091035 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb565164f_b73d_43a9_a4a0_4501a215f7d7.slice/crio-61fc755656bc969cc6289424eb6be759e0b11adb56fbc59f5aa283c2f05b153a WatchSource:0}: Error finding container 61fc755656bc969cc6289424eb6be759e0b11adb56fbc59f5aa283c2f05b153a: Status 404 returned error can't find the container with id 61fc755656bc969cc6289424eb6be759e0b11adb56fbc59f5aa283c2f05b153a Feb 28 09:14:48 crc kubenswrapper[4687]: I0228 09:14:48.119607 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-pq26r"] Feb 28 09:14:48 crc kubenswrapper[4687]: I0228 09:14:48.183177 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6lptp" event={"ID":"02027bc3-0840-49ed-afe6-13d5285bdff9","Type":"ContainerStarted","Data":"ff96d9c0faa8473ee2bdb38c0590c893c53379ec9297f6ee53c488cda6340811"} Feb 28 09:14:48 crc kubenswrapper[4687]: I0228 09:14:48.184010 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cfc4f8755-dkqc8" event={"ID":"b565164f-b73d-43a9-a4a0-4501a215f7d7","Type":"ContainerStarted","Data":"61fc755656bc969cc6289424eb6be759e0b11adb56fbc59f5aa283c2f05b153a"} Feb 28 09:14:48 crc kubenswrapper[4687]: I0228 09:14:48.185676 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-pq26r" event={"ID":"fe724a47-e6db-4940-885f-318abb45fb46","Type":"ContainerStarted","Data":"c1705d40bd1374ed3ad1c4cec290e7d407595bb5feeeac35bf306deb725731b6"} Feb 28 09:14:48 crc kubenswrapper[4687]: I0228 09:14:48.186682 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-69594cc75-c66wz" event={"ID":"88c47658-dd20-4f97-b063-b95f5bd2d79d","Type":"ContainerStarted","Data":"e0a75e2d57bf45419cc9345191c2e7760deaf7945baffa15fffed121716401cb"} Feb 28 09:14:48 crc kubenswrapper[4687]: I0228 09:14:48.283846 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-8kg5p" Feb 28 09:14:48 crc kubenswrapper[4687]: I0228 09:14:48.629014 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-8kg5p"] Feb 28 09:14:49 crc kubenswrapper[4687]: I0228 09:14:49.193683 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cfc4f8755-dkqc8" event={"ID":"b565164f-b73d-43a9-a4a0-4501a215f7d7","Type":"ContainerStarted","Data":"a2efac92eb06b1f22c73389205deb87c5b39904ebc67ab277eebd38586a4de46"} Feb 28 09:14:49 crc kubenswrapper[4687]: I0228 09:14:49.194722 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-8kg5p" event={"ID":"5dc19058-cbce-4742-9c1f-11005a9aefbf","Type":"ContainerStarted","Data":"34a7d8d019c567fd456b4bad1f56f521d41c24995fd6ee92463cc701aea02715"} Feb 28 09:14:49 crc kubenswrapper[4687]: I0228 09:14:49.221754 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7cfc4f8755-dkqc8" podStartSLOduration=2.221725436 podStartE2EDuration="2.221725436s" podCreationTimestamp="2026-02-28 09:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:14:49.20942715 +0000 UTC m=+680.899996497" watchObservedRunningTime="2026-02-28 09:14:49.221725436 +0000 UTC m=+680.912294773" Feb 28 09:14:51 crc kubenswrapper[4687]: I0228 09:14:51.209550 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6lptp" 
event={"ID":"02027bc3-0840-49ed-afe6-13d5285bdff9","Type":"ContainerStarted","Data":"a74df80d58a53cb8d48d7df1ccfe9be59064e28df42a6ef2cd1c73ce3fbd18dc"} Feb 28 09:14:51 crc kubenswrapper[4687]: I0228 09:14:51.210088 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-6lptp" Feb 28 09:14:51 crc kubenswrapper[4687]: I0228 09:14:51.211394 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-8kg5p" event={"ID":"5dc19058-cbce-4742-9c1f-11005a9aefbf","Type":"ContainerStarted","Data":"9c4c5d09e81c16185db0c14e73321b8790dcf5b090b3768fae453ab072675fab"} Feb 28 09:14:51 crc kubenswrapper[4687]: I0228 09:14:51.212204 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-8kg5p" Feb 28 09:14:51 crc kubenswrapper[4687]: I0228 09:14:51.214184 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-pq26r" event={"ID":"fe724a47-e6db-4940-885f-318abb45fb46","Type":"ContainerStarted","Data":"cb5ca9ceff69bfb6980de041da7a3842b15e2f2c22d023d5a0db0d8417b05095"} Feb 28 09:14:51 crc kubenswrapper[4687]: I0228 09:14:51.215421 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-c66wz" event={"ID":"88c47658-dd20-4f97-b063-b95f5bd2d79d","Type":"ContainerStarted","Data":"fc49e36aa43a68fa4f319d69861f25b7bd77d0f980bf3d9db83305413da25a47"} Feb 28 09:14:51 crc kubenswrapper[4687]: I0228 09:14:51.227826 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-6lptp" podStartSLOduration=1.302928197 podStartE2EDuration="4.227813891s" podCreationTimestamp="2026-02-28 09:14:47 +0000 UTC" firstStartedPulling="2026-02-28 09:14:47.717639664 +0000 UTC m=+679.408209001" lastFinishedPulling="2026-02-28 09:14:50.642525357 +0000 UTC m=+682.333094695" observedRunningTime="2026-02-28 
09:14:51.223202038 +0000 UTC m=+682.913771375" watchObservedRunningTime="2026-02-28 09:14:51.227813891 +0000 UTC m=+682.918383229" Feb 28 09:14:51 crc kubenswrapper[4687]: I0228 09:14:51.238586 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-pq26r" podStartSLOduration=1.748639559 podStartE2EDuration="4.238570228s" podCreationTimestamp="2026-02-28 09:14:47 +0000 UTC" firstStartedPulling="2026-02-28 09:14:48.15310548 +0000 UTC m=+679.843674816" lastFinishedPulling="2026-02-28 09:14:50.643036147 +0000 UTC m=+682.333605485" observedRunningTime="2026-02-28 09:14:51.236325105 +0000 UTC m=+682.926894462" watchObservedRunningTime="2026-02-28 09:14:51.238570228 +0000 UTC m=+682.929139564" Feb 28 09:14:51 crc kubenswrapper[4687]: I0228 09:14:51.250502 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-8kg5p" podStartSLOduration=2.236882689 podStartE2EDuration="4.250483159s" podCreationTimestamp="2026-02-28 09:14:47 +0000 UTC" firstStartedPulling="2026-02-28 09:14:48.636065374 +0000 UTC m=+680.326634711" lastFinishedPulling="2026-02-28 09:14:50.649665845 +0000 UTC m=+682.340235181" observedRunningTime="2026-02-28 09:14:51.249917716 +0000 UTC m=+682.940487053" watchObservedRunningTime="2026-02-28 09:14:51.250483159 +0000 UTC m=+682.941052496" Feb 28 09:14:53 crc kubenswrapper[4687]: I0228 09:14:53.229408 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-c66wz" event={"ID":"88c47658-dd20-4f97-b063-b95f5bd2d79d","Type":"ContainerStarted","Data":"064eb5ced1e22767ada6be2d9dea3861c1d3e50ee350eeb5d9bc501c442af93e"} Feb 28 09:14:53 crc kubenswrapper[4687]: I0228 09:14:53.243920 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-c66wz" podStartSLOduration=1.411076391 podStartE2EDuration="6.243898463s" 
podCreationTimestamp="2026-02-28 09:14:47 +0000 UTC" firstStartedPulling="2026-02-28 09:14:48.041043157 +0000 UTC m=+679.731612493" lastFinishedPulling="2026-02-28 09:14:52.873865228 +0000 UTC m=+684.564434565" observedRunningTime="2026-02-28 09:14:53.243108517 +0000 UTC m=+684.933677854" watchObservedRunningTime="2026-02-28 09:14:53.243898463 +0000 UTC m=+684.934467800" Feb 28 09:14:57 crc kubenswrapper[4687]: I0228 09:14:57.717417 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-6lptp" Feb 28 09:14:57 crc kubenswrapper[4687]: I0228 09:14:57.926677 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:57 crc kubenswrapper[4687]: I0228 09:14:57.926752 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:57 crc kubenswrapper[4687]: I0228 09:14:57.930651 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:58 crc kubenswrapper[4687]: I0228 09:14:58.260012 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7cfc4f8755-dkqc8" Feb 28 09:14:58 crc kubenswrapper[4687]: I0228 09:14:58.304653 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-4m8kh"] Feb 28 09:15:00 crc kubenswrapper[4687]: I0228 09:15:00.131410 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537835-9grsd"] Feb 28 09:15:00 crc kubenswrapper[4687]: I0228 09:15:00.132245 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537835-9grsd" Feb 28 09:15:00 crc kubenswrapper[4687]: I0228 09:15:00.134474 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 28 09:15:00 crc kubenswrapper[4687]: I0228 09:15:00.134892 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 28 09:15:00 crc kubenswrapper[4687]: I0228 09:15:00.139886 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537835-9grsd"] Feb 28 09:15:00 crc kubenswrapper[4687]: I0228 09:15:00.166306 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e3383d8-2679-40ea-97e5-fbd106b18c91-config-volume\") pod \"collect-profiles-29537835-9grsd\" (UID: \"2e3383d8-2679-40ea-97e5-fbd106b18c91\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537835-9grsd" Feb 28 09:15:00 crc kubenswrapper[4687]: I0228 09:15:00.166360 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdgmq\" (UniqueName: \"kubernetes.io/projected/2e3383d8-2679-40ea-97e5-fbd106b18c91-kube-api-access-tdgmq\") pod \"collect-profiles-29537835-9grsd\" (UID: \"2e3383d8-2679-40ea-97e5-fbd106b18c91\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537835-9grsd" Feb 28 09:15:00 crc kubenswrapper[4687]: I0228 09:15:00.166517 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2e3383d8-2679-40ea-97e5-fbd106b18c91-secret-volume\") pod \"collect-profiles-29537835-9grsd\" (UID: \"2e3383d8-2679-40ea-97e5-fbd106b18c91\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29537835-9grsd" Feb 28 09:15:00 crc kubenswrapper[4687]: I0228 09:15:00.267714 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e3383d8-2679-40ea-97e5-fbd106b18c91-config-volume\") pod \"collect-profiles-29537835-9grsd\" (UID: \"2e3383d8-2679-40ea-97e5-fbd106b18c91\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537835-9grsd" Feb 28 09:15:00 crc kubenswrapper[4687]: I0228 09:15:00.267794 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdgmq\" (UniqueName: \"kubernetes.io/projected/2e3383d8-2679-40ea-97e5-fbd106b18c91-kube-api-access-tdgmq\") pod \"collect-profiles-29537835-9grsd\" (UID: \"2e3383d8-2679-40ea-97e5-fbd106b18c91\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537835-9grsd" Feb 28 09:15:00 crc kubenswrapper[4687]: I0228 09:15:00.267867 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2e3383d8-2679-40ea-97e5-fbd106b18c91-secret-volume\") pod \"collect-profiles-29537835-9grsd\" (UID: \"2e3383d8-2679-40ea-97e5-fbd106b18c91\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537835-9grsd" Feb 28 09:15:00 crc kubenswrapper[4687]: I0228 09:15:00.268515 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e3383d8-2679-40ea-97e5-fbd106b18c91-config-volume\") pod \"collect-profiles-29537835-9grsd\" (UID: \"2e3383d8-2679-40ea-97e5-fbd106b18c91\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537835-9grsd" Feb 28 09:15:00 crc kubenswrapper[4687]: I0228 09:15:00.273827 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2e3383d8-2679-40ea-97e5-fbd106b18c91-secret-volume\") pod \"collect-profiles-29537835-9grsd\" (UID: \"2e3383d8-2679-40ea-97e5-fbd106b18c91\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537835-9grsd" Feb 28 09:15:00 crc kubenswrapper[4687]: I0228 09:15:00.283508 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdgmq\" (UniqueName: \"kubernetes.io/projected/2e3383d8-2679-40ea-97e5-fbd106b18c91-kube-api-access-tdgmq\") pod \"collect-profiles-29537835-9grsd\" (UID: \"2e3383d8-2679-40ea-97e5-fbd106b18c91\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537835-9grsd" Feb 28 09:15:00 crc kubenswrapper[4687]: I0228 09:15:00.447760 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537835-9grsd" Feb 28 09:15:00 crc kubenswrapper[4687]: I0228 09:15:00.624158 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537835-9grsd"] Feb 28 09:15:01 crc kubenswrapper[4687]: I0228 09:15:01.273826 4687 generic.go:334] "Generic (PLEG): container finished" podID="2e3383d8-2679-40ea-97e5-fbd106b18c91" containerID="447e228198642fa20aab6fcd3d719ed69156fd00567e259cc4619d3b355f03a5" exitCode=0 Feb 28 09:15:01 crc kubenswrapper[4687]: I0228 09:15:01.273902 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537835-9grsd" event={"ID":"2e3383d8-2679-40ea-97e5-fbd106b18c91","Type":"ContainerDied","Data":"447e228198642fa20aab6fcd3d719ed69156fd00567e259cc4619d3b355f03a5"} Feb 28 09:15:01 crc kubenswrapper[4687]: I0228 09:15:01.274258 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537835-9grsd" 
event={"ID":"2e3383d8-2679-40ea-97e5-fbd106b18c91","Type":"ContainerStarted","Data":"4899b8bae9d3e775608a444b810c4b2a7358ddafa0ea3d4736a7aa1b4d1e1954"} Feb 28 09:15:02 crc kubenswrapper[4687]: I0228 09:15:02.466181 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537835-9grsd" Feb 28 09:15:02 crc kubenswrapper[4687]: I0228 09:15:02.595715 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2e3383d8-2679-40ea-97e5-fbd106b18c91-secret-volume\") pod \"2e3383d8-2679-40ea-97e5-fbd106b18c91\" (UID: \"2e3383d8-2679-40ea-97e5-fbd106b18c91\") " Feb 28 09:15:02 crc kubenswrapper[4687]: I0228 09:15:02.595829 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdgmq\" (UniqueName: \"kubernetes.io/projected/2e3383d8-2679-40ea-97e5-fbd106b18c91-kube-api-access-tdgmq\") pod \"2e3383d8-2679-40ea-97e5-fbd106b18c91\" (UID: \"2e3383d8-2679-40ea-97e5-fbd106b18c91\") " Feb 28 09:15:02 crc kubenswrapper[4687]: I0228 09:15:02.595870 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e3383d8-2679-40ea-97e5-fbd106b18c91-config-volume\") pod \"2e3383d8-2679-40ea-97e5-fbd106b18c91\" (UID: \"2e3383d8-2679-40ea-97e5-fbd106b18c91\") " Feb 28 09:15:02 crc kubenswrapper[4687]: I0228 09:15:02.596923 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e3383d8-2679-40ea-97e5-fbd106b18c91-config-volume" (OuterVolumeSpecName: "config-volume") pod "2e3383d8-2679-40ea-97e5-fbd106b18c91" (UID: "2e3383d8-2679-40ea-97e5-fbd106b18c91"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:15:02 crc kubenswrapper[4687]: I0228 09:15:02.602767 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e3383d8-2679-40ea-97e5-fbd106b18c91-kube-api-access-tdgmq" (OuterVolumeSpecName: "kube-api-access-tdgmq") pod "2e3383d8-2679-40ea-97e5-fbd106b18c91" (UID: "2e3383d8-2679-40ea-97e5-fbd106b18c91"). InnerVolumeSpecName "kube-api-access-tdgmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:15:02 crc kubenswrapper[4687]: I0228 09:15:02.611869 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e3383d8-2679-40ea-97e5-fbd106b18c91-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2e3383d8-2679-40ea-97e5-fbd106b18c91" (UID: "2e3383d8-2679-40ea-97e5-fbd106b18c91"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:15:02 crc kubenswrapper[4687]: I0228 09:15:02.697978 4687 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2e3383d8-2679-40ea-97e5-fbd106b18c91-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 28 09:15:02 crc kubenswrapper[4687]: I0228 09:15:02.698015 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdgmq\" (UniqueName: \"kubernetes.io/projected/2e3383d8-2679-40ea-97e5-fbd106b18c91-kube-api-access-tdgmq\") on node \"crc\" DevicePath \"\"" Feb 28 09:15:02 crc kubenswrapper[4687]: I0228 09:15:02.698047 4687 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e3383d8-2679-40ea-97e5-fbd106b18c91-config-volume\") on node \"crc\" DevicePath \"\"" Feb 28 09:15:03 crc kubenswrapper[4687]: I0228 09:15:03.288316 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537835-9grsd" 
event={"ID":"2e3383d8-2679-40ea-97e5-fbd106b18c91","Type":"ContainerDied","Data":"4899b8bae9d3e775608a444b810c4b2a7358ddafa0ea3d4736a7aa1b4d1e1954"} Feb 28 09:15:03 crc kubenswrapper[4687]: I0228 09:15:03.288538 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4899b8bae9d3e775608a444b810c4b2a7358ddafa0ea3d4736a7aa1b4d1e1954" Feb 28 09:15:03 crc kubenswrapper[4687]: I0228 09:15:03.288390 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537835-9grsd" Feb 28 09:15:08 crc kubenswrapper[4687]: I0228 09:15:08.291416 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-8kg5p" Feb 28 09:15:16 crc kubenswrapper[4687]: I0228 09:15:16.662055 4687 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 28 09:15:20 crc kubenswrapper[4687]: I0228 09:15:20.335203 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7"] Feb 28 09:15:20 crc kubenswrapper[4687]: E0228 09:15:20.335819 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e3383d8-2679-40ea-97e5-fbd106b18c91" containerName="collect-profiles" Feb 28 09:15:20 crc kubenswrapper[4687]: I0228 09:15:20.335832 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e3383d8-2679-40ea-97e5-fbd106b18c91" containerName="collect-profiles" Feb 28 09:15:20 crc kubenswrapper[4687]: I0228 09:15:20.335924 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e3383d8-2679-40ea-97e5-fbd106b18c91" containerName="collect-profiles" Feb 28 09:15:20 crc kubenswrapper[4687]: I0228 09:15:20.336614 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7" Feb 28 09:15:20 crc kubenswrapper[4687]: I0228 09:15:20.338187 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 28 09:15:20 crc kubenswrapper[4687]: I0228 09:15:20.348469 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7"] Feb 28 09:15:20 crc kubenswrapper[4687]: I0228 09:15:20.536245 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/993f721e-f5f5-4e7e-9896-5931bd6e0023-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7\" (UID: \"993f721e-f5f5-4e7e-9896-5931bd6e0023\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7" Feb 28 09:15:20 crc kubenswrapper[4687]: I0228 09:15:20.536298 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzs88\" (UniqueName: \"kubernetes.io/projected/993f721e-f5f5-4e7e-9896-5931bd6e0023-kube-api-access-hzs88\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7\" (UID: \"993f721e-f5f5-4e7e-9896-5931bd6e0023\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7" Feb 28 09:15:20 crc kubenswrapper[4687]: I0228 09:15:20.536369 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/993f721e-f5f5-4e7e-9896-5931bd6e0023-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7\" (UID: \"993f721e-f5f5-4e7e-9896-5931bd6e0023\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7" Feb 28 09:15:20 crc kubenswrapper[4687]: 
I0228 09:15:20.637737 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/993f721e-f5f5-4e7e-9896-5931bd6e0023-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7\" (UID: \"993f721e-f5f5-4e7e-9896-5931bd6e0023\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7" Feb 28 09:15:20 crc kubenswrapper[4687]: I0228 09:15:20.637815 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/993f721e-f5f5-4e7e-9896-5931bd6e0023-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7\" (UID: \"993f721e-f5f5-4e7e-9896-5931bd6e0023\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7" Feb 28 09:15:20 crc kubenswrapper[4687]: I0228 09:15:20.637857 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzs88\" (UniqueName: \"kubernetes.io/projected/993f721e-f5f5-4e7e-9896-5931bd6e0023-kube-api-access-hzs88\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7\" (UID: \"993f721e-f5f5-4e7e-9896-5931bd6e0023\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7" Feb 28 09:15:20 crc kubenswrapper[4687]: I0228 09:15:20.638531 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/993f721e-f5f5-4e7e-9896-5931bd6e0023-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7\" (UID: \"993f721e-f5f5-4e7e-9896-5931bd6e0023\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7" Feb 28 09:15:20 crc kubenswrapper[4687]: I0228 09:15:20.638531 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/993f721e-f5f5-4e7e-9896-5931bd6e0023-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7\" (UID: \"993f721e-f5f5-4e7e-9896-5931bd6e0023\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7" Feb 28 09:15:20 crc kubenswrapper[4687]: I0228 09:15:20.657930 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzs88\" (UniqueName: \"kubernetes.io/projected/993f721e-f5f5-4e7e-9896-5931bd6e0023-kube-api-access-hzs88\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7\" (UID: \"993f721e-f5f5-4e7e-9896-5931bd6e0023\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7" Feb 28 09:15:20 crc kubenswrapper[4687]: I0228 09:15:20.660107 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7" Feb 28 09:15:21 crc kubenswrapper[4687]: I0228 09:15:21.046042 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7"] Feb 28 09:15:21 crc kubenswrapper[4687]: I0228 09:15:21.401457 4687 generic.go:334] "Generic (PLEG): container finished" podID="993f721e-f5f5-4e7e-9896-5931bd6e0023" containerID="759f710b76c4637f000dd2349d8c7a7cae35d00e258fc31920fa0e11ea3ab020" exitCode=0 Feb 28 09:15:21 crc kubenswrapper[4687]: I0228 09:15:21.401529 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7" event={"ID":"993f721e-f5f5-4e7e-9896-5931bd6e0023","Type":"ContainerDied","Data":"759f710b76c4637f000dd2349d8c7a7cae35d00e258fc31920fa0e11ea3ab020"} Feb 28 09:15:21 crc kubenswrapper[4687]: I0228 09:15:21.401580 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7" event={"ID":"993f721e-f5f5-4e7e-9896-5931bd6e0023","Type":"ContainerStarted","Data":"94f22ff9d483b339f3efd691857f82e7d9cde3888d7e62655864c4bdefdf7fba"} Feb 28 09:15:22 crc kubenswrapper[4687]: I0228 09:15:22.666892 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n9sll"] Feb 28 09:15:22 crc kubenswrapper[4687]: I0228 09:15:22.668213 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n9sll" Feb 28 09:15:22 crc kubenswrapper[4687]: I0228 09:15:22.675223 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n9sll"] Feb 28 09:15:22 crc kubenswrapper[4687]: I0228 09:15:22.676626 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj8hn\" (UniqueName: \"kubernetes.io/projected/fb90b643-2dc4-435a-af82-863cd351199a-kube-api-access-rj8hn\") pod \"redhat-operators-n9sll\" (UID: \"fb90b643-2dc4-435a-af82-863cd351199a\") " pod="openshift-marketplace/redhat-operators-n9sll" Feb 28 09:15:22 crc kubenswrapper[4687]: I0228 09:15:22.676869 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb90b643-2dc4-435a-af82-863cd351199a-catalog-content\") pod \"redhat-operators-n9sll\" (UID: \"fb90b643-2dc4-435a-af82-863cd351199a\") " pod="openshift-marketplace/redhat-operators-n9sll" Feb 28 09:15:22 crc kubenswrapper[4687]: I0228 09:15:22.677005 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb90b643-2dc4-435a-af82-863cd351199a-utilities\") pod \"redhat-operators-n9sll\" (UID: \"fb90b643-2dc4-435a-af82-863cd351199a\") " 
pod="openshift-marketplace/redhat-operators-n9sll" Feb 28 09:15:22 crc kubenswrapper[4687]: I0228 09:15:22.777908 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb90b643-2dc4-435a-af82-863cd351199a-catalog-content\") pod \"redhat-operators-n9sll\" (UID: \"fb90b643-2dc4-435a-af82-863cd351199a\") " pod="openshift-marketplace/redhat-operators-n9sll" Feb 28 09:15:22 crc kubenswrapper[4687]: I0228 09:15:22.777978 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb90b643-2dc4-435a-af82-863cd351199a-utilities\") pod \"redhat-operators-n9sll\" (UID: \"fb90b643-2dc4-435a-af82-863cd351199a\") " pod="openshift-marketplace/redhat-operators-n9sll" Feb 28 09:15:22 crc kubenswrapper[4687]: I0228 09:15:22.778051 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj8hn\" (UniqueName: \"kubernetes.io/projected/fb90b643-2dc4-435a-af82-863cd351199a-kube-api-access-rj8hn\") pod \"redhat-operators-n9sll\" (UID: \"fb90b643-2dc4-435a-af82-863cd351199a\") " pod="openshift-marketplace/redhat-operators-n9sll" Feb 28 09:15:22 crc kubenswrapper[4687]: I0228 09:15:22.778690 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb90b643-2dc4-435a-af82-863cd351199a-catalog-content\") pod \"redhat-operators-n9sll\" (UID: \"fb90b643-2dc4-435a-af82-863cd351199a\") " pod="openshift-marketplace/redhat-operators-n9sll" Feb 28 09:15:22 crc kubenswrapper[4687]: I0228 09:15:22.779335 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb90b643-2dc4-435a-af82-863cd351199a-utilities\") pod \"redhat-operators-n9sll\" (UID: \"fb90b643-2dc4-435a-af82-863cd351199a\") " pod="openshift-marketplace/redhat-operators-n9sll" Feb 28 09:15:22 crc 
kubenswrapper[4687]: I0228 09:15:22.799607 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj8hn\" (UniqueName: \"kubernetes.io/projected/fb90b643-2dc4-435a-af82-863cd351199a-kube-api-access-rj8hn\") pod \"redhat-operators-n9sll\" (UID: \"fb90b643-2dc4-435a-af82-863cd351199a\") " pod="openshift-marketplace/redhat-operators-n9sll" Feb 28 09:15:22 crc kubenswrapper[4687]: I0228 09:15:22.987412 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n9sll" Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.216883 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n9sll"] Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.332463 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-4m8kh" podUID="96e679f2-11c5-4ade-abc4-56a7b85a5668" containerName="console" containerID="cri-o://170acaba5784a498318b9c514d6dd3588976fb12f0497dd619f19d8c29bbd2ba" gracePeriod=15 Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.426678 4687 generic.go:334] "Generic (PLEG): container finished" podID="fb90b643-2dc4-435a-af82-863cd351199a" containerID="38b6d0c5bc3706d5b7a232c7b6aab5439e9e5234fe14501ba365cc796b36a28c" exitCode=0 Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.426782 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9sll" event={"ID":"fb90b643-2dc4-435a-af82-863cd351199a","Type":"ContainerDied","Data":"38b6d0c5bc3706d5b7a232c7b6aab5439e9e5234fe14501ba365cc796b36a28c"} Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.426856 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9sll" event={"ID":"fb90b643-2dc4-435a-af82-863cd351199a","Type":"ContainerStarted","Data":"06e6aab1717c1170d5e76b05726f6068a5c58b2225bb2fb9fb0efe6c1e1d2709"} Feb 28 
09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.429622 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7" event={"ID":"993f721e-f5f5-4e7e-9896-5931bd6e0023","Type":"ContainerDied","Data":"5e1210027693466ca8c3c3a1104cd6c29262e34e613db7a7b62ef24b3d5eb351"} Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.429554 4687 generic.go:334] "Generic (PLEG): container finished" podID="993f721e-f5f5-4e7e-9896-5931bd6e0023" containerID="5e1210027693466ca8c3c3a1104cd6c29262e34e613db7a7b62ef24b3d5eb351" exitCode=0 Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.621846 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-4m8kh_96e679f2-11c5-4ade-abc4-56a7b85a5668/console/0.log" Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.621912 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4m8kh" Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.720730 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96e679f2-11c5-4ade-abc4-56a7b85a5668-console-serving-cert\") pod \"96e679f2-11c5-4ade-abc4-56a7b85a5668\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.720787 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pppwd\" (UniqueName: \"kubernetes.io/projected/96e679f2-11c5-4ade-abc4-56a7b85a5668-kube-api-access-pppwd\") pod \"96e679f2-11c5-4ade-abc4-56a7b85a5668\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.720909 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/96e679f2-11c5-4ade-abc4-56a7b85a5668-service-ca\") pod \"96e679f2-11c5-4ade-abc4-56a7b85a5668\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.720955 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96e679f2-11c5-4ade-abc4-56a7b85a5668-trusted-ca-bundle\") pod \"96e679f2-11c5-4ade-abc4-56a7b85a5668\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.720984 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96e679f2-11c5-4ade-abc4-56a7b85a5668-console-config\") pod \"96e679f2-11c5-4ade-abc4-56a7b85a5668\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.721014 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96e679f2-11c5-4ade-abc4-56a7b85a5668-oauth-serving-cert\") pod \"96e679f2-11c5-4ade-abc4-56a7b85a5668\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.721065 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96e679f2-11c5-4ade-abc4-56a7b85a5668-console-oauth-config\") pod \"96e679f2-11c5-4ade-abc4-56a7b85a5668\" (UID: \"96e679f2-11c5-4ade-abc4-56a7b85a5668\") " Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.721667 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96e679f2-11c5-4ade-abc4-56a7b85a5668-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "96e679f2-11c5-4ade-abc4-56a7b85a5668" (UID: "96e679f2-11c5-4ade-abc4-56a7b85a5668"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.721680 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96e679f2-11c5-4ade-abc4-56a7b85a5668-service-ca" (OuterVolumeSpecName: "service-ca") pod "96e679f2-11c5-4ade-abc4-56a7b85a5668" (UID: "96e679f2-11c5-4ade-abc4-56a7b85a5668"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.721768 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96e679f2-11c5-4ade-abc4-56a7b85a5668-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "96e679f2-11c5-4ade-abc4-56a7b85a5668" (UID: "96e679f2-11c5-4ade-abc4-56a7b85a5668"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.723053 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96e679f2-11c5-4ade-abc4-56a7b85a5668-console-config" (OuterVolumeSpecName: "console-config") pod "96e679f2-11c5-4ade-abc4-56a7b85a5668" (UID: "96e679f2-11c5-4ade-abc4-56a7b85a5668"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.726203 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96e679f2-11c5-4ade-abc4-56a7b85a5668-kube-api-access-pppwd" (OuterVolumeSpecName: "kube-api-access-pppwd") pod "96e679f2-11c5-4ade-abc4-56a7b85a5668" (UID: "96e679f2-11c5-4ade-abc4-56a7b85a5668"). InnerVolumeSpecName "kube-api-access-pppwd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.727060 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96e679f2-11c5-4ade-abc4-56a7b85a5668-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "96e679f2-11c5-4ade-abc4-56a7b85a5668" (UID: "96e679f2-11c5-4ade-abc4-56a7b85a5668"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.727316 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96e679f2-11c5-4ade-abc4-56a7b85a5668-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "96e679f2-11c5-4ade-abc4-56a7b85a5668" (UID: "96e679f2-11c5-4ade-abc4-56a7b85a5668"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.822901 4687 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96e679f2-11c5-4ade-abc4-56a7b85a5668-service-ca\") on node \"crc\" DevicePath \"\"" Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.823168 4687 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96e679f2-11c5-4ade-abc4-56a7b85a5668-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.823180 4687 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96e679f2-11c5-4ade-abc4-56a7b85a5668-console-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.823189 4687 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/96e679f2-11c5-4ade-abc4-56a7b85a5668-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.823197 4687 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96e679f2-11c5-4ade-abc4-56a7b85a5668-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.823206 4687 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96e679f2-11c5-4ade-abc4-56a7b85a5668-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 09:15:23 crc kubenswrapper[4687]: I0228 09:15:23.823214 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pppwd\" (UniqueName: \"kubernetes.io/projected/96e679f2-11c5-4ade-abc4-56a7b85a5668-kube-api-access-pppwd\") on node \"crc\" DevicePath \"\"" Feb 28 09:15:24 crc kubenswrapper[4687]: I0228 09:15:24.440989 4687 generic.go:334] "Generic (PLEG): container finished" podID="993f721e-f5f5-4e7e-9896-5931bd6e0023" containerID="f581fae95eed07c850c46ab63175181eaecd7de73df24c524212e85f2f81e6b3" exitCode=0 Feb 28 09:15:24 crc kubenswrapper[4687]: I0228 09:15:24.441073 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7" event={"ID":"993f721e-f5f5-4e7e-9896-5931bd6e0023","Type":"ContainerDied","Data":"f581fae95eed07c850c46ab63175181eaecd7de73df24c524212e85f2f81e6b3"} Feb 28 09:15:24 crc kubenswrapper[4687]: I0228 09:15:24.443904 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-4m8kh_96e679f2-11c5-4ade-abc4-56a7b85a5668/console/0.log" Feb 28 09:15:24 crc kubenswrapper[4687]: I0228 09:15:24.443955 4687 generic.go:334] "Generic (PLEG): container finished" podID="96e679f2-11c5-4ade-abc4-56a7b85a5668" 
containerID="170acaba5784a498318b9c514d6dd3588976fb12f0497dd619f19d8c29bbd2ba" exitCode=2 Feb 28 09:15:24 crc kubenswrapper[4687]: I0228 09:15:24.444050 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4m8kh" Feb 28 09:15:24 crc kubenswrapper[4687]: I0228 09:15:24.444097 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4m8kh" event={"ID":"96e679f2-11c5-4ade-abc4-56a7b85a5668","Type":"ContainerDied","Data":"170acaba5784a498318b9c514d6dd3588976fb12f0497dd619f19d8c29bbd2ba"} Feb 28 09:15:24 crc kubenswrapper[4687]: I0228 09:15:24.444179 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4m8kh" event={"ID":"96e679f2-11c5-4ade-abc4-56a7b85a5668","Type":"ContainerDied","Data":"7cb40cdab6c6d995623e6531ade87d03b25c4923780123a6e174ffd2b135d5f6"} Feb 28 09:15:24 crc kubenswrapper[4687]: I0228 09:15:24.444221 4687 scope.go:117] "RemoveContainer" containerID="170acaba5784a498318b9c514d6dd3588976fb12f0497dd619f19d8c29bbd2ba" Feb 28 09:15:24 crc kubenswrapper[4687]: I0228 09:15:24.447391 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9sll" event={"ID":"fb90b643-2dc4-435a-af82-863cd351199a","Type":"ContainerStarted","Data":"cde789d7b5e8068b56ea39564e3d8fe3b37fc0f4b9c93ff5d623a1fca793b703"} Feb 28 09:15:24 crc kubenswrapper[4687]: I0228 09:15:24.462392 4687 scope.go:117] "RemoveContainer" containerID="170acaba5784a498318b9c514d6dd3588976fb12f0497dd619f19d8c29bbd2ba" Feb 28 09:15:24 crc kubenswrapper[4687]: E0228 09:15:24.462945 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"170acaba5784a498318b9c514d6dd3588976fb12f0497dd619f19d8c29bbd2ba\": container with ID starting with 170acaba5784a498318b9c514d6dd3588976fb12f0497dd619f19d8c29bbd2ba not found: ID does not exist" 
containerID="170acaba5784a498318b9c514d6dd3588976fb12f0497dd619f19d8c29bbd2ba" Feb 28 09:15:24 crc kubenswrapper[4687]: I0228 09:15:24.463000 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"170acaba5784a498318b9c514d6dd3588976fb12f0497dd619f19d8c29bbd2ba"} err="failed to get container status \"170acaba5784a498318b9c514d6dd3588976fb12f0497dd619f19d8c29bbd2ba\": rpc error: code = NotFound desc = could not find container \"170acaba5784a498318b9c514d6dd3588976fb12f0497dd619f19d8c29bbd2ba\": container with ID starting with 170acaba5784a498318b9c514d6dd3588976fb12f0497dd619f19d8c29bbd2ba not found: ID does not exist" Feb 28 09:15:24 crc kubenswrapper[4687]: I0228 09:15:24.474537 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-4m8kh"] Feb 28 09:15:24 crc kubenswrapper[4687]: I0228 09:15:24.477371 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-4m8kh"] Feb 28 09:15:24 crc kubenswrapper[4687]: I0228 09:15:24.664637 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96e679f2-11c5-4ade-abc4-56a7b85a5668" path="/var/lib/kubelet/pods/96e679f2-11c5-4ade-abc4-56a7b85a5668/volumes" Feb 28 09:15:25 crc kubenswrapper[4687]: I0228 09:15:25.458803 4687 generic.go:334] "Generic (PLEG): container finished" podID="fb90b643-2dc4-435a-af82-863cd351199a" containerID="cde789d7b5e8068b56ea39564e3d8fe3b37fc0f4b9c93ff5d623a1fca793b703" exitCode=0 Feb 28 09:15:25 crc kubenswrapper[4687]: I0228 09:15:25.458899 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9sll" event={"ID":"fb90b643-2dc4-435a-af82-863cd351199a","Type":"ContainerDied","Data":"cde789d7b5e8068b56ea39564e3d8fe3b37fc0f4b9c93ff5d623a1fca793b703"} Feb 28 09:15:25 crc kubenswrapper[4687]: I0228 09:15:25.676898 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7" Feb 28 09:15:25 crc kubenswrapper[4687]: I0228 09:15:25.749190 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/993f721e-f5f5-4e7e-9896-5931bd6e0023-util\") pod \"993f721e-f5f5-4e7e-9896-5931bd6e0023\" (UID: \"993f721e-f5f5-4e7e-9896-5931bd6e0023\") " Feb 28 09:15:25 crc kubenswrapper[4687]: I0228 09:15:25.749227 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/993f721e-f5f5-4e7e-9896-5931bd6e0023-bundle\") pod \"993f721e-f5f5-4e7e-9896-5931bd6e0023\" (UID: \"993f721e-f5f5-4e7e-9896-5931bd6e0023\") " Feb 28 09:15:25 crc kubenswrapper[4687]: I0228 09:15:25.749334 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzs88\" (UniqueName: \"kubernetes.io/projected/993f721e-f5f5-4e7e-9896-5931bd6e0023-kube-api-access-hzs88\") pod \"993f721e-f5f5-4e7e-9896-5931bd6e0023\" (UID: \"993f721e-f5f5-4e7e-9896-5931bd6e0023\") " Feb 28 09:15:25 crc kubenswrapper[4687]: I0228 09:15:25.750622 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/993f721e-f5f5-4e7e-9896-5931bd6e0023-bundle" (OuterVolumeSpecName: "bundle") pod "993f721e-f5f5-4e7e-9896-5931bd6e0023" (UID: "993f721e-f5f5-4e7e-9896-5931bd6e0023"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:15:25 crc kubenswrapper[4687]: I0228 09:15:25.754449 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/993f721e-f5f5-4e7e-9896-5931bd6e0023-kube-api-access-hzs88" (OuterVolumeSpecName: "kube-api-access-hzs88") pod "993f721e-f5f5-4e7e-9896-5931bd6e0023" (UID: "993f721e-f5f5-4e7e-9896-5931bd6e0023"). InnerVolumeSpecName "kube-api-access-hzs88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:15:25 crc kubenswrapper[4687]: I0228 09:15:25.759485 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/993f721e-f5f5-4e7e-9896-5931bd6e0023-util" (OuterVolumeSpecName: "util") pod "993f721e-f5f5-4e7e-9896-5931bd6e0023" (UID: "993f721e-f5f5-4e7e-9896-5931bd6e0023"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:15:25 crc kubenswrapper[4687]: I0228 09:15:25.850561 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzs88\" (UniqueName: \"kubernetes.io/projected/993f721e-f5f5-4e7e-9896-5931bd6e0023-kube-api-access-hzs88\") on node \"crc\" DevicePath \"\"" Feb 28 09:15:25 crc kubenswrapper[4687]: I0228 09:15:25.850610 4687 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/993f721e-f5f5-4e7e-9896-5931bd6e0023-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:15:25 crc kubenswrapper[4687]: I0228 09:15:25.850626 4687 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/993f721e-f5f5-4e7e-9896-5931bd6e0023-util\") on node \"crc\" DevicePath \"\"" Feb 28 09:15:26 crc kubenswrapper[4687]: I0228 09:15:26.468521 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7" Feb 28 09:15:26 crc kubenswrapper[4687]: I0228 09:15:26.468550 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7" event={"ID":"993f721e-f5f5-4e7e-9896-5931bd6e0023","Type":"ContainerDied","Data":"94f22ff9d483b339f3efd691857f82e7d9cde3888d7e62655864c4bdefdf7fba"} Feb 28 09:15:26 crc kubenswrapper[4687]: I0228 09:15:26.469072 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94f22ff9d483b339f3efd691857f82e7d9cde3888d7e62655864c4bdefdf7fba" Feb 28 09:15:26 crc kubenswrapper[4687]: I0228 09:15:26.470689 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9sll" event={"ID":"fb90b643-2dc4-435a-af82-863cd351199a","Type":"ContainerStarted","Data":"8e11641a015fe9e24d59918db4defad6d0febf365cc32e40f9fd0853318b3924"} Feb 28 09:15:26 crc kubenswrapper[4687]: I0228 09:15:26.493170 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n9sll" podStartSLOduration=1.98303125 podStartE2EDuration="4.493151271s" podCreationTimestamp="2026-02-28 09:15:22 +0000 UTC" firstStartedPulling="2026-02-28 09:15:23.429775502 +0000 UTC m=+715.120344839" lastFinishedPulling="2026-02-28 09:15:25.939895523 +0000 UTC m=+717.630464860" observedRunningTime="2026-02-28 09:15:26.492998123 +0000 UTC m=+718.183567460" watchObservedRunningTime="2026-02-28 09:15:26.493151271 +0000 UTC m=+718.183720608" Feb 28 09:15:32 crc kubenswrapper[4687]: I0228 09:15:32.988380 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n9sll" Feb 28 09:15:32 crc kubenswrapper[4687]: I0228 09:15:32.988911 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-n9sll" Feb 28 09:15:33 crc kubenswrapper[4687]: I0228 09:15:33.023789 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n9sll" Feb 28 09:15:33 crc kubenswrapper[4687]: I0228 09:15:33.533647 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n9sll" Feb 28 09:15:35 crc kubenswrapper[4687]: I0228 09:15:35.059859 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n9sll"] Feb 28 09:15:35 crc kubenswrapper[4687]: I0228 09:15:35.522735 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n9sll" podUID="fb90b643-2dc4-435a-af82-863cd351199a" containerName="registry-server" containerID="cri-o://8e11641a015fe9e24d59918db4defad6d0febf365cc32e40f9fd0853318b3924" gracePeriod=2 Feb 28 09:15:36 crc kubenswrapper[4687]: I0228 09:15:36.348001 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n9sll" Feb 28 09:15:36 crc kubenswrapper[4687]: I0228 09:15:36.381866 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb90b643-2dc4-435a-af82-863cd351199a-utilities\") pod \"fb90b643-2dc4-435a-af82-863cd351199a\" (UID: \"fb90b643-2dc4-435a-af82-863cd351199a\") " Feb 28 09:15:36 crc kubenswrapper[4687]: I0228 09:15:36.381964 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb90b643-2dc4-435a-af82-863cd351199a-catalog-content\") pod \"fb90b643-2dc4-435a-af82-863cd351199a\" (UID: \"fb90b643-2dc4-435a-af82-863cd351199a\") " Feb 28 09:15:36 crc kubenswrapper[4687]: I0228 09:15:36.382031 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rj8hn\" (UniqueName: \"kubernetes.io/projected/fb90b643-2dc4-435a-af82-863cd351199a-kube-api-access-rj8hn\") pod \"fb90b643-2dc4-435a-af82-863cd351199a\" (UID: \"fb90b643-2dc4-435a-af82-863cd351199a\") " Feb 28 09:15:36 crc kubenswrapper[4687]: I0228 09:15:36.382525 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb90b643-2dc4-435a-af82-863cd351199a-utilities" (OuterVolumeSpecName: "utilities") pod "fb90b643-2dc4-435a-af82-863cd351199a" (UID: "fb90b643-2dc4-435a-af82-863cd351199a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:15:36 crc kubenswrapper[4687]: I0228 09:15:36.387965 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb90b643-2dc4-435a-af82-863cd351199a-kube-api-access-rj8hn" (OuterVolumeSpecName: "kube-api-access-rj8hn") pod "fb90b643-2dc4-435a-af82-863cd351199a" (UID: "fb90b643-2dc4-435a-af82-863cd351199a"). InnerVolumeSpecName "kube-api-access-rj8hn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:15:36 crc kubenswrapper[4687]: I0228 09:15:36.475530 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb90b643-2dc4-435a-af82-863cd351199a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb90b643-2dc4-435a-af82-863cd351199a" (UID: "fb90b643-2dc4-435a-af82-863cd351199a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:15:36 crc kubenswrapper[4687]: I0228 09:15:36.483414 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb90b643-2dc4-435a-af82-863cd351199a-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:15:36 crc kubenswrapper[4687]: I0228 09:15:36.483443 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb90b643-2dc4-435a-af82-863cd351199a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:15:36 crc kubenswrapper[4687]: I0228 09:15:36.483457 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rj8hn\" (UniqueName: \"kubernetes.io/projected/fb90b643-2dc4-435a-af82-863cd351199a-kube-api-access-rj8hn\") on node \"crc\" DevicePath \"\"" Feb 28 09:15:36 crc kubenswrapper[4687]: I0228 09:15:36.530108 4687 generic.go:334] "Generic (PLEG): container finished" podID="fb90b643-2dc4-435a-af82-863cd351199a" containerID="8e11641a015fe9e24d59918db4defad6d0febf365cc32e40f9fd0853318b3924" exitCode=0 Feb 28 09:15:36 crc kubenswrapper[4687]: I0228 09:15:36.530154 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9sll" event={"ID":"fb90b643-2dc4-435a-af82-863cd351199a","Type":"ContainerDied","Data":"8e11641a015fe9e24d59918db4defad6d0febf365cc32e40f9fd0853318b3924"} Feb 28 09:15:36 crc kubenswrapper[4687]: I0228 09:15:36.530182 4687 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-n9sll" event={"ID":"fb90b643-2dc4-435a-af82-863cd351199a","Type":"ContainerDied","Data":"06e6aab1717c1170d5e76b05726f6068a5c58b2225bb2fb9fb0efe6c1e1d2709"} Feb 28 09:15:36 crc kubenswrapper[4687]: I0228 09:15:36.530184 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n9sll" Feb 28 09:15:36 crc kubenswrapper[4687]: I0228 09:15:36.530199 4687 scope.go:117] "RemoveContainer" containerID="8e11641a015fe9e24d59918db4defad6d0febf365cc32e40f9fd0853318b3924" Feb 28 09:15:36 crc kubenswrapper[4687]: I0228 09:15:36.544209 4687 scope.go:117] "RemoveContainer" containerID="cde789d7b5e8068b56ea39564e3d8fe3b37fc0f4b9c93ff5d623a1fca793b703" Feb 28 09:15:36 crc kubenswrapper[4687]: I0228 09:15:36.558333 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n9sll"] Feb 28 09:15:36 crc kubenswrapper[4687]: I0228 09:15:36.558635 4687 scope.go:117] "RemoveContainer" containerID="38b6d0c5bc3706d5b7a232c7b6aab5439e9e5234fe14501ba365cc796b36a28c" Feb 28 09:15:36 crc kubenswrapper[4687]: I0228 09:15:36.561175 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n9sll"] Feb 28 09:15:36 crc kubenswrapper[4687]: I0228 09:15:36.581551 4687 scope.go:117] "RemoveContainer" containerID="8e11641a015fe9e24d59918db4defad6d0febf365cc32e40f9fd0853318b3924" Feb 28 09:15:36 crc kubenswrapper[4687]: E0228 09:15:36.581917 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e11641a015fe9e24d59918db4defad6d0febf365cc32e40f9fd0853318b3924\": container with ID starting with 8e11641a015fe9e24d59918db4defad6d0febf365cc32e40f9fd0853318b3924 not found: ID does not exist" containerID="8e11641a015fe9e24d59918db4defad6d0febf365cc32e40f9fd0853318b3924" Feb 28 09:15:36 crc kubenswrapper[4687]: I0228 09:15:36.581971 4687 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e11641a015fe9e24d59918db4defad6d0febf365cc32e40f9fd0853318b3924"} err="failed to get container status \"8e11641a015fe9e24d59918db4defad6d0febf365cc32e40f9fd0853318b3924\": rpc error: code = NotFound desc = could not find container \"8e11641a015fe9e24d59918db4defad6d0febf365cc32e40f9fd0853318b3924\": container with ID starting with 8e11641a015fe9e24d59918db4defad6d0febf365cc32e40f9fd0853318b3924 not found: ID does not exist" Feb 28 09:15:36 crc kubenswrapper[4687]: I0228 09:15:36.582000 4687 scope.go:117] "RemoveContainer" containerID="cde789d7b5e8068b56ea39564e3d8fe3b37fc0f4b9c93ff5d623a1fca793b703" Feb 28 09:15:36 crc kubenswrapper[4687]: E0228 09:15:36.582442 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cde789d7b5e8068b56ea39564e3d8fe3b37fc0f4b9c93ff5d623a1fca793b703\": container with ID starting with cde789d7b5e8068b56ea39564e3d8fe3b37fc0f4b9c93ff5d623a1fca793b703 not found: ID does not exist" containerID="cde789d7b5e8068b56ea39564e3d8fe3b37fc0f4b9c93ff5d623a1fca793b703" Feb 28 09:15:36 crc kubenswrapper[4687]: I0228 09:15:36.582482 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cde789d7b5e8068b56ea39564e3d8fe3b37fc0f4b9c93ff5d623a1fca793b703"} err="failed to get container status \"cde789d7b5e8068b56ea39564e3d8fe3b37fc0f4b9c93ff5d623a1fca793b703\": rpc error: code = NotFound desc = could not find container \"cde789d7b5e8068b56ea39564e3d8fe3b37fc0f4b9c93ff5d623a1fca793b703\": container with ID starting with cde789d7b5e8068b56ea39564e3d8fe3b37fc0f4b9c93ff5d623a1fca793b703 not found: ID does not exist" Feb 28 09:15:36 crc kubenswrapper[4687]: I0228 09:15:36.582515 4687 scope.go:117] "RemoveContainer" containerID="38b6d0c5bc3706d5b7a232c7b6aab5439e9e5234fe14501ba365cc796b36a28c" Feb 28 09:15:36 crc kubenswrapper[4687]: E0228 
09:15:36.582872 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38b6d0c5bc3706d5b7a232c7b6aab5439e9e5234fe14501ba365cc796b36a28c\": container with ID starting with 38b6d0c5bc3706d5b7a232c7b6aab5439e9e5234fe14501ba365cc796b36a28c not found: ID does not exist" containerID="38b6d0c5bc3706d5b7a232c7b6aab5439e9e5234fe14501ba365cc796b36a28c" Feb 28 09:15:36 crc kubenswrapper[4687]: I0228 09:15:36.582901 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38b6d0c5bc3706d5b7a232c7b6aab5439e9e5234fe14501ba365cc796b36a28c"} err="failed to get container status \"38b6d0c5bc3706d5b7a232c7b6aab5439e9e5234fe14501ba365cc796b36a28c\": rpc error: code = NotFound desc = could not find container \"38b6d0c5bc3706d5b7a232c7b6aab5439e9e5234fe14501ba365cc796b36a28c\": container with ID starting with 38b6d0c5bc3706d5b7a232c7b6aab5439e9e5234fe14501ba365cc796b36a28c not found: ID does not exist" Feb 28 09:15:36 crc kubenswrapper[4687]: I0228 09:15:36.662762 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb90b643-2dc4-435a-af82-863cd351199a" path="/var/lib/kubelet/pods/fb90b643-2dc4-435a-af82-863cd351199a/volumes" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.054987 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6f7cb57fd8-p9bs4"] Feb 28 09:15:37 crc kubenswrapper[4687]: E0228 09:15:37.055379 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="993f721e-f5f5-4e7e-9896-5931bd6e0023" containerName="extract" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.055426 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="993f721e-f5f5-4e7e-9896-5931bd6e0023" containerName="extract" Feb 28 09:15:37 crc kubenswrapper[4687]: E0228 09:15:37.055449 4687 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="993f721e-f5f5-4e7e-9896-5931bd6e0023" containerName="util" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.055456 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="993f721e-f5f5-4e7e-9896-5931bd6e0023" containerName="util" Feb 28 09:15:37 crc kubenswrapper[4687]: E0228 09:15:37.055467 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb90b643-2dc4-435a-af82-863cd351199a" containerName="extract-content" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.055473 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb90b643-2dc4-435a-af82-863cd351199a" containerName="extract-content" Feb 28 09:15:37 crc kubenswrapper[4687]: E0228 09:15:37.055484 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96e679f2-11c5-4ade-abc4-56a7b85a5668" containerName="console" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.055491 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="96e679f2-11c5-4ade-abc4-56a7b85a5668" containerName="console" Feb 28 09:15:37 crc kubenswrapper[4687]: E0228 09:15:37.055501 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="993f721e-f5f5-4e7e-9896-5931bd6e0023" containerName="pull" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.055506 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="993f721e-f5f5-4e7e-9896-5931bd6e0023" containerName="pull" Feb 28 09:15:37 crc kubenswrapper[4687]: E0228 09:15:37.055519 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb90b643-2dc4-435a-af82-863cd351199a" containerName="extract-utilities" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.055529 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb90b643-2dc4-435a-af82-863cd351199a" containerName="extract-utilities" Feb 28 09:15:37 crc kubenswrapper[4687]: E0228 09:15:37.055540 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb90b643-2dc4-435a-af82-863cd351199a" 
containerName="registry-server" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.055547 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb90b643-2dc4-435a-af82-863cd351199a" containerName="registry-server" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.055661 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb90b643-2dc4-435a-af82-863cd351199a" containerName="registry-server" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.055673 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="993f721e-f5f5-4e7e-9896-5931bd6e0023" containerName="extract" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.055682 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="96e679f2-11c5-4ade-abc4-56a7b85a5668" containerName="console" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.056297 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6f7cb57fd8-p9bs4" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.058284 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.058625 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.062130 4687 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-fn7pm" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.065322 4687 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.065524 4687 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 
09:15:37.068400 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6f7cb57fd8-p9bs4"] Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.094115 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzst7\" (UniqueName: \"kubernetes.io/projected/370a0b00-a4b2-428b-887b-5e0a7dce8d53-kube-api-access-rzst7\") pod \"metallb-operator-controller-manager-6f7cb57fd8-p9bs4\" (UID: \"370a0b00-a4b2-428b-887b-5e0a7dce8d53\") " pod="metallb-system/metallb-operator-controller-manager-6f7cb57fd8-p9bs4" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.094159 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/370a0b00-a4b2-428b-887b-5e0a7dce8d53-apiservice-cert\") pod \"metallb-operator-controller-manager-6f7cb57fd8-p9bs4\" (UID: \"370a0b00-a4b2-428b-887b-5e0a7dce8d53\") " pod="metallb-system/metallb-operator-controller-manager-6f7cb57fd8-p9bs4" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.094193 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/370a0b00-a4b2-428b-887b-5e0a7dce8d53-webhook-cert\") pod \"metallb-operator-controller-manager-6f7cb57fd8-p9bs4\" (UID: \"370a0b00-a4b2-428b-887b-5e0a7dce8d53\") " pod="metallb-system/metallb-operator-controller-manager-6f7cb57fd8-p9bs4" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.195916 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzst7\" (UniqueName: \"kubernetes.io/projected/370a0b00-a4b2-428b-887b-5e0a7dce8d53-kube-api-access-rzst7\") pod \"metallb-operator-controller-manager-6f7cb57fd8-p9bs4\" (UID: \"370a0b00-a4b2-428b-887b-5e0a7dce8d53\") " pod="metallb-system/metallb-operator-controller-manager-6f7cb57fd8-p9bs4" Feb 28 
09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.196114 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/370a0b00-a4b2-428b-887b-5e0a7dce8d53-apiservice-cert\") pod \"metallb-operator-controller-manager-6f7cb57fd8-p9bs4\" (UID: \"370a0b00-a4b2-428b-887b-5e0a7dce8d53\") " pod="metallb-system/metallb-operator-controller-manager-6f7cb57fd8-p9bs4" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.196219 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/370a0b00-a4b2-428b-887b-5e0a7dce8d53-webhook-cert\") pod \"metallb-operator-controller-manager-6f7cb57fd8-p9bs4\" (UID: \"370a0b00-a4b2-428b-887b-5e0a7dce8d53\") " pod="metallb-system/metallb-operator-controller-manager-6f7cb57fd8-p9bs4" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.200675 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/370a0b00-a4b2-428b-887b-5e0a7dce8d53-webhook-cert\") pod \"metallb-operator-controller-manager-6f7cb57fd8-p9bs4\" (UID: \"370a0b00-a4b2-428b-887b-5e0a7dce8d53\") " pod="metallb-system/metallb-operator-controller-manager-6f7cb57fd8-p9bs4" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.200672 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/370a0b00-a4b2-428b-887b-5e0a7dce8d53-apiservice-cert\") pod \"metallb-operator-controller-manager-6f7cb57fd8-p9bs4\" (UID: \"370a0b00-a4b2-428b-887b-5e0a7dce8d53\") " pod="metallb-system/metallb-operator-controller-manager-6f7cb57fd8-p9bs4" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.209856 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzst7\" (UniqueName: \"kubernetes.io/projected/370a0b00-a4b2-428b-887b-5e0a7dce8d53-kube-api-access-rzst7\") pod 
\"metallb-operator-controller-manager-6f7cb57fd8-p9bs4\" (UID: \"370a0b00-a4b2-428b-887b-5e0a7dce8d53\") " pod="metallb-system/metallb-operator-controller-manager-6f7cb57fd8-p9bs4" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.303730 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-686bcc794c-5fsqb"] Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.304364 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-686bcc794c-5fsqb" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.306388 4687 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.306731 4687 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-wtzkn" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.310254 4687 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.324310 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-686bcc794c-5fsqb"] Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.371671 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6f7cb57fd8-p9bs4" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.400687 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp27r\" (UniqueName: \"kubernetes.io/projected/287a3bc3-7f28-47be-90ab-6b25ea27db38-kube-api-access-cp27r\") pod \"metallb-operator-webhook-server-686bcc794c-5fsqb\" (UID: \"287a3bc3-7f28-47be-90ab-6b25ea27db38\") " pod="metallb-system/metallb-operator-webhook-server-686bcc794c-5fsqb" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.400756 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/287a3bc3-7f28-47be-90ab-6b25ea27db38-apiservice-cert\") pod \"metallb-operator-webhook-server-686bcc794c-5fsqb\" (UID: \"287a3bc3-7f28-47be-90ab-6b25ea27db38\") " pod="metallb-system/metallb-operator-webhook-server-686bcc794c-5fsqb" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.400790 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/287a3bc3-7f28-47be-90ab-6b25ea27db38-webhook-cert\") pod \"metallb-operator-webhook-server-686bcc794c-5fsqb\" (UID: \"287a3bc3-7f28-47be-90ab-6b25ea27db38\") " pod="metallb-system/metallb-operator-webhook-server-686bcc794c-5fsqb" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.502643 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/287a3bc3-7f28-47be-90ab-6b25ea27db38-webhook-cert\") pod \"metallb-operator-webhook-server-686bcc794c-5fsqb\" (UID: \"287a3bc3-7f28-47be-90ab-6b25ea27db38\") " pod="metallb-system/metallb-operator-webhook-server-686bcc794c-5fsqb" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.502817 4687 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cp27r\" (UniqueName: \"kubernetes.io/projected/287a3bc3-7f28-47be-90ab-6b25ea27db38-kube-api-access-cp27r\") pod \"metallb-operator-webhook-server-686bcc794c-5fsqb\" (UID: \"287a3bc3-7f28-47be-90ab-6b25ea27db38\") " pod="metallb-system/metallb-operator-webhook-server-686bcc794c-5fsqb" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.502867 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/287a3bc3-7f28-47be-90ab-6b25ea27db38-apiservice-cert\") pod \"metallb-operator-webhook-server-686bcc794c-5fsqb\" (UID: \"287a3bc3-7f28-47be-90ab-6b25ea27db38\") " pod="metallb-system/metallb-operator-webhook-server-686bcc794c-5fsqb" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.507296 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/287a3bc3-7f28-47be-90ab-6b25ea27db38-apiservice-cert\") pod \"metallb-operator-webhook-server-686bcc794c-5fsqb\" (UID: \"287a3bc3-7f28-47be-90ab-6b25ea27db38\") " pod="metallb-system/metallb-operator-webhook-server-686bcc794c-5fsqb" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.507519 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/287a3bc3-7f28-47be-90ab-6b25ea27db38-webhook-cert\") pod \"metallb-operator-webhook-server-686bcc794c-5fsqb\" (UID: \"287a3bc3-7f28-47be-90ab-6b25ea27db38\") " pod="metallb-system/metallb-operator-webhook-server-686bcc794c-5fsqb" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.524229 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp27r\" (UniqueName: \"kubernetes.io/projected/287a3bc3-7f28-47be-90ab-6b25ea27db38-kube-api-access-cp27r\") pod \"metallb-operator-webhook-server-686bcc794c-5fsqb\" (UID: \"287a3bc3-7f28-47be-90ab-6b25ea27db38\") 
" pod="metallb-system/metallb-operator-webhook-server-686bcc794c-5fsqb" Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.568522 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6f7cb57fd8-p9bs4"] Feb 28 09:15:37 crc kubenswrapper[4687]: W0228 09:15:37.573796 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod370a0b00_a4b2_428b_887b_5e0a7dce8d53.slice/crio-08bb0533022e9b1ee11a9963e8bd9039d49d35adfdaa634f9920328a4fb3bc7a WatchSource:0}: Error finding container 08bb0533022e9b1ee11a9963e8bd9039d49d35adfdaa634f9920328a4fb3bc7a: Status 404 returned error can't find the container with id 08bb0533022e9b1ee11a9963e8bd9039d49d35adfdaa634f9920328a4fb3bc7a Feb 28 09:15:37 crc kubenswrapper[4687]: I0228 09:15:37.617431 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-686bcc794c-5fsqb" Feb 28 09:15:38 crc kubenswrapper[4687]: I0228 09:15:38.025240 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-686bcc794c-5fsqb"] Feb 28 09:15:38 crc kubenswrapper[4687]: I0228 09:15:38.542349 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-686bcc794c-5fsqb" event={"ID":"287a3bc3-7f28-47be-90ab-6b25ea27db38","Type":"ContainerStarted","Data":"14eb2c8620ebf2baee9ecb9899605237f5bfba4eacb286f771d03458b4fa72f9"} Feb 28 09:15:38 crc kubenswrapper[4687]: I0228 09:15:38.543499 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6f7cb57fd8-p9bs4" event={"ID":"370a0b00-a4b2-428b-887b-5e0a7dce8d53","Type":"ContainerStarted","Data":"08bb0533022e9b1ee11a9963e8bd9039d49d35adfdaa634f9920328a4fb3bc7a"} Feb 28 09:15:40 crc kubenswrapper[4687]: I0228 09:15:40.557388 4687 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="metallb-system/metallb-operator-controller-manager-6f7cb57fd8-p9bs4" event={"ID":"370a0b00-a4b2-428b-887b-5e0a7dce8d53","Type":"ContainerStarted","Data":"d4b3c5eece83470cfe5651e8a1a7ae62ef59d5d8a030febe859301a92095953e"} Feb 28 09:15:40 crc kubenswrapper[4687]: I0228 09:15:40.557656 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6f7cb57fd8-p9bs4" Feb 28 09:15:40 crc kubenswrapper[4687]: I0228 09:15:40.573769 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6f7cb57fd8-p9bs4" podStartSLOduration=0.870944181 podStartE2EDuration="3.573745077s" podCreationTimestamp="2026-02-28 09:15:37 +0000 UTC" firstStartedPulling="2026-02-28 09:15:37.576275951 +0000 UTC m=+729.266845288" lastFinishedPulling="2026-02-28 09:15:40.279076847 +0000 UTC m=+731.969646184" observedRunningTime="2026-02-28 09:15:40.570998281 +0000 UTC m=+732.261567628" watchObservedRunningTime="2026-02-28 09:15:40.573745077 +0000 UTC m=+732.264314414" Feb 28 09:15:42 crc kubenswrapper[4687]: I0228 09:15:42.573300 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-686bcc794c-5fsqb" event={"ID":"287a3bc3-7f28-47be-90ab-6b25ea27db38","Type":"ContainerStarted","Data":"872cd0524d40169854f3e1e2d3294147fd9f76d78c3383b0511337a3774c9f07"} Feb 28 09:15:42 crc kubenswrapper[4687]: I0228 09:15:42.573891 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-686bcc794c-5fsqb" Feb 28 09:15:42 crc kubenswrapper[4687]: I0228 09:15:42.596124 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-686bcc794c-5fsqb" podStartSLOduration=1.959934342 podStartE2EDuration="5.59610087s" podCreationTimestamp="2026-02-28 09:15:37 +0000 UTC" firstStartedPulling="2026-02-28 
09:15:38.0314816 +0000 UTC m=+729.722050938" lastFinishedPulling="2026-02-28 09:15:41.667648129 +0000 UTC m=+733.358217466" observedRunningTime="2026-02-28 09:15:42.590557945 +0000 UTC m=+734.281127282" watchObservedRunningTime="2026-02-28 09:15:42.59610087 +0000 UTC m=+734.286670207" Feb 28 09:15:55 crc kubenswrapper[4687]: I0228 09:15:55.002290 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:15:55 crc kubenswrapper[4687]: I0228 09:15:55.002594 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:15:57 crc kubenswrapper[4687]: I0228 09:15:57.622301 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-686bcc794c-5fsqb" Feb 28 09:16:00 crc kubenswrapper[4687]: I0228 09:16:00.129686 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537836-jgbxm"] Feb 28 09:16:00 crc kubenswrapper[4687]: I0228 09:16:00.130568 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537836-jgbxm" Feb 28 09:16:00 crc kubenswrapper[4687]: I0228 09:16:00.134311 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:16:00 crc kubenswrapper[4687]: I0228 09:16:00.134331 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:16:00 crc kubenswrapper[4687]: I0228 09:16:00.136054 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537836-jgbxm"] Feb 28 09:16:00 crc kubenswrapper[4687]: I0228 09:16:00.136971 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:16:00 crc kubenswrapper[4687]: I0228 09:16:00.233588 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlpc7\" (UniqueName: \"kubernetes.io/projected/cdcf2df7-2440-48d7-ab5c-8ffacc7bdd5a-kube-api-access-vlpc7\") pod \"auto-csr-approver-29537836-jgbxm\" (UID: \"cdcf2df7-2440-48d7-ab5c-8ffacc7bdd5a\") " pod="openshift-infra/auto-csr-approver-29537836-jgbxm" Feb 28 09:16:00 crc kubenswrapper[4687]: I0228 09:16:00.334913 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlpc7\" (UniqueName: \"kubernetes.io/projected/cdcf2df7-2440-48d7-ab5c-8ffacc7bdd5a-kube-api-access-vlpc7\") pod \"auto-csr-approver-29537836-jgbxm\" (UID: \"cdcf2df7-2440-48d7-ab5c-8ffacc7bdd5a\") " pod="openshift-infra/auto-csr-approver-29537836-jgbxm" Feb 28 09:16:00 crc kubenswrapper[4687]: I0228 09:16:00.351619 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlpc7\" (UniqueName: \"kubernetes.io/projected/cdcf2df7-2440-48d7-ab5c-8ffacc7bdd5a-kube-api-access-vlpc7\") pod \"auto-csr-approver-29537836-jgbxm\" (UID: \"cdcf2df7-2440-48d7-ab5c-8ffacc7bdd5a\") " 
pod="openshift-infra/auto-csr-approver-29537836-jgbxm" Feb 28 09:16:00 crc kubenswrapper[4687]: I0228 09:16:00.443978 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537836-jgbxm" Feb 28 09:16:00 crc kubenswrapper[4687]: I0228 09:16:00.872779 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537836-jgbxm"] Feb 28 09:16:01 crc kubenswrapper[4687]: I0228 09:16:01.669376 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537836-jgbxm" event={"ID":"cdcf2df7-2440-48d7-ab5c-8ffacc7bdd5a","Type":"ContainerStarted","Data":"ef1fa4840e71d24f967edbe0451a56bb5668e1b59fa103589c1407ac0bf8befd"} Feb 28 09:16:02 crc kubenswrapper[4687]: I0228 09:16:02.580087 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lk5vw"] Feb 28 09:16:02 crc kubenswrapper[4687]: I0228 09:16:02.581560 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lk5vw" Feb 28 09:16:02 crc kubenswrapper[4687]: I0228 09:16:02.587309 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lk5vw"] Feb 28 09:16:02 crc kubenswrapper[4687]: I0228 09:16:02.666555 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c11f84fc-38fa-42af-9a8b-daefd98ccd7b-utilities\") pod \"certified-operators-lk5vw\" (UID: \"c11f84fc-38fa-42af-9a8b-daefd98ccd7b\") " pod="openshift-marketplace/certified-operators-lk5vw" Feb 28 09:16:02 crc kubenswrapper[4687]: I0228 09:16:02.666614 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46mhp\" (UniqueName: \"kubernetes.io/projected/c11f84fc-38fa-42af-9a8b-daefd98ccd7b-kube-api-access-46mhp\") pod \"certified-operators-lk5vw\" (UID: \"c11f84fc-38fa-42af-9a8b-daefd98ccd7b\") " pod="openshift-marketplace/certified-operators-lk5vw" Feb 28 09:16:02 crc kubenswrapper[4687]: I0228 09:16:02.666683 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c11f84fc-38fa-42af-9a8b-daefd98ccd7b-catalog-content\") pod \"certified-operators-lk5vw\" (UID: \"c11f84fc-38fa-42af-9a8b-daefd98ccd7b\") " pod="openshift-marketplace/certified-operators-lk5vw" Feb 28 09:16:02 crc kubenswrapper[4687]: I0228 09:16:02.676149 4687 generic.go:334] "Generic (PLEG): container finished" podID="cdcf2df7-2440-48d7-ab5c-8ffacc7bdd5a" containerID="3e9f05e5348087ae22fa99f6c6e7c935f7a19de5011e2a8e76f694474f6eb090" exitCode=0 Feb 28 09:16:02 crc kubenswrapper[4687]: I0228 09:16:02.676193 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537836-jgbxm" 
event={"ID":"cdcf2df7-2440-48d7-ab5c-8ffacc7bdd5a","Type":"ContainerDied","Data":"3e9f05e5348087ae22fa99f6c6e7c935f7a19de5011e2a8e76f694474f6eb090"} Feb 28 09:16:02 crc kubenswrapper[4687]: I0228 09:16:02.767959 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c11f84fc-38fa-42af-9a8b-daefd98ccd7b-catalog-content\") pod \"certified-operators-lk5vw\" (UID: \"c11f84fc-38fa-42af-9a8b-daefd98ccd7b\") " pod="openshift-marketplace/certified-operators-lk5vw" Feb 28 09:16:02 crc kubenswrapper[4687]: I0228 09:16:02.768158 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c11f84fc-38fa-42af-9a8b-daefd98ccd7b-utilities\") pod \"certified-operators-lk5vw\" (UID: \"c11f84fc-38fa-42af-9a8b-daefd98ccd7b\") " pod="openshift-marketplace/certified-operators-lk5vw" Feb 28 09:16:02 crc kubenswrapper[4687]: I0228 09:16:02.768263 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46mhp\" (UniqueName: \"kubernetes.io/projected/c11f84fc-38fa-42af-9a8b-daefd98ccd7b-kube-api-access-46mhp\") pod \"certified-operators-lk5vw\" (UID: \"c11f84fc-38fa-42af-9a8b-daefd98ccd7b\") " pod="openshift-marketplace/certified-operators-lk5vw" Feb 28 09:16:02 crc kubenswrapper[4687]: I0228 09:16:02.768410 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c11f84fc-38fa-42af-9a8b-daefd98ccd7b-catalog-content\") pod \"certified-operators-lk5vw\" (UID: \"c11f84fc-38fa-42af-9a8b-daefd98ccd7b\") " pod="openshift-marketplace/certified-operators-lk5vw" Feb 28 09:16:02 crc kubenswrapper[4687]: I0228 09:16:02.768668 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c11f84fc-38fa-42af-9a8b-daefd98ccd7b-utilities\") pod 
\"certified-operators-lk5vw\" (UID: \"c11f84fc-38fa-42af-9a8b-daefd98ccd7b\") " pod="openshift-marketplace/certified-operators-lk5vw" Feb 28 09:16:02 crc kubenswrapper[4687]: I0228 09:16:02.784051 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46mhp\" (UniqueName: \"kubernetes.io/projected/c11f84fc-38fa-42af-9a8b-daefd98ccd7b-kube-api-access-46mhp\") pod \"certified-operators-lk5vw\" (UID: \"c11f84fc-38fa-42af-9a8b-daefd98ccd7b\") " pod="openshift-marketplace/certified-operators-lk5vw" Feb 28 09:16:02 crc kubenswrapper[4687]: I0228 09:16:02.898186 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lk5vw" Feb 28 09:16:03 crc kubenswrapper[4687]: I0228 09:16:03.287815 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lk5vw"] Feb 28 09:16:03 crc kubenswrapper[4687]: I0228 09:16:03.683681 4687 generic.go:334] "Generic (PLEG): container finished" podID="c11f84fc-38fa-42af-9a8b-daefd98ccd7b" containerID="10cb070e1f85f49d00657d1faa2b66ea3cd3dc5ba6a0b5c1fd64364a66191f03" exitCode=0 Feb 28 09:16:03 crc kubenswrapper[4687]: I0228 09:16:03.683783 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lk5vw" event={"ID":"c11f84fc-38fa-42af-9a8b-daefd98ccd7b","Type":"ContainerDied","Data":"10cb070e1f85f49d00657d1faa2b66ea3cd3dc5ba6a0b5c1fd64364a66191f03"} Feb 28 09:16:03 crc kubenswrapper[4687]: I0228 09:16:03.684778 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lk5vw" event={"ID":"c11f84fc-38fa-42af-9a8b-daefd98ccd7b","Type":"ContainerStarted","Data":"1cea0655fe1dd0786ab327c9404d2882b0d245c403f5ea64ccd619d30c4de0a7"} Feb 28 09:16:03 crc kubenswrapper[4687]: I0228 09:16:03.860521 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537836-jgbxm" Feb 28 09:16:03 crc kubenswrapper[4687]: I0228 09:16:03.982531 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlpc7\" (UniqueName: \"kubernetes.io/projected/cdcf2df7-2440-48d7-ab5c-8ffacc7bdd5a-kube-api-access-vlpc7\") pod \"cdcf2df7-2440-48d7-ab5c-8ffacc7bdd5a\" (UID: \"cdcf2df7-2440-48d7-ab5c-8ffacc7bdd5a\") " Feb 28 09:16:03 crc kubenswrapper[4687]: I0228 09:16:03.987094 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdcf2df7-2440-48d7-ab5c-8ffacc7bdd5a-kube-api-access-vlpc7" (OuterVolumeSpecName: "kube-api-access-vlpc7") pod "cdcf2df7-2440-48d7-ab5c-8ffacc7bdd5a" (UID: "cdcf2df7-2440-48d7-ab5c-8ffacc7bdd5a"). InnerVolumeSpecName "kube-api-access-vlpc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:16:04 crc kubenswrapper[4687]: I0228 09:16:04.084114 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlpc7\" (UniqueName: \"kubernetes.io/projected/cdcf2df7-2440-48d7-ab5c-8ffacc7bdd5a-kube-api-access-vlpc7\") on node \"crc\" DevicePath \"\"" Feb 28 09:16:04 crc kubenswrapper[4687]: I0228 09:16:04.690876 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537836-jgbxm" event={"ID":"cdcf2df7-2440-48d7-ab5c-8ffacc7bdd5a","Type":"ContainerDied","Data":"ef1fa4840e71d24f967edbe0451a56bb5668e1b59fa103589c1407ac0bf8befd"} Feb 28 09:16:04 crc kubenswrapper[4687]: I0228 09:16:04.691111 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef1fa4840e71d24f967edbe0451a56bb5668e1b59fa103589c1407ac0bf8befd" Feb 28 09:16:04 crc kubenswrapper[4687]: I0228 09:16:04.690902 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537836-jgbxm" Feb 28 09:16:04 crc kubenswrapper[4687]: I0228 09:16:04.693128 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lk5vw" event={"ID":"c11f84fc-38fa-42af-9a8b-daefd98ccd7b","Type":"ContainerStarted","Data":"7ff570b271cb6a2a5d1e96feb74dc04ee5c8f72d3077c3a6a49a64fad8ff4b72"} Feb 28 09:16:04 crc kubenswrapper[4687]: I0228 09:16:04.901354 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537830-99xt9"] Feb 28 09:16:04 crc kubenswrapper[4687]: I0228 09:16:04.904060 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537830-99xt9"] Feb 28 09:16:05 crc kubenswrapper[4687]: I0228 09:16:05.700603 4687 generic.go:334] "Generic (PLEG): container finished" podID="c11f84fc-38fa-42af-9a8b-daefd98ccd7b" containerID="7ff570b271cb6a2a5d1e96feb74dc04ee5c8f72d3077c3a6a49a64fad8ff4b72" exitCode=0 Feb 28 09:16:05 crc kubenswrapper[4687]: I0228 09:16:05.700646 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lk5vw" event={"ID":"c11f84fc-38fa-42af-9a8b-daefd98ccd7b","Type":"ContainerDied","Data":"7ff570b271cb6a2a5d1e96feb74dc04ee5c8f72d3077c3a6a49a64fad8ff4b72"} Feb 28 09:16:06 crc kubenswrapper[4687]: I0228 09:16:06.664292 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aca959e0-750a-4677-ab35-59ff3b0c6d5b" path="/var/lib/kubelet/pods/aca959e0-750a-4677-ab35-59ff3b0c6d5b/volumes" Feb 28 09:16:06 crc kubenswrapper[4687]: I0228 09:16:06.706696 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lk5vw" event={"ID":"c11f84fc-38fa-42af-9a8b-daefd98ccd7b","Type":"ContainerStarted","Data":"c3d7c6158a0ae1c4585dd3a2429e14458e12d11af8ee3ae042a1fe8754f187cb"} Feb 28 09:16:06 crc kubenswrapper[4687]: I0228 09:16:06.721085 4687 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lk5vw" podStartSLOduration=2.196064182 podStartE2EDuration="4.721075029s" podCreationTimestamp="2026-02-28 09:16:02 +0000 UTC" firstStartedPulling="2026-02-28 09:16:03.685444527 +0000 UTC m=+755.376013864" lastFinishedPulling="2026-02-28 09:16:06.210455373 +0000 UTC m=+757.901024711" observedRunningTime="2026-02-28 09:16:06.719340256 +0000 UTC m=+758.409909594" watchObservedRunningTime="2026-02-28 09:16:06.721075029 +0000 UTC m=+758.411644366" Feb 28 09:16:12 crc kubenswrapper[4687]: I0228 09:16:12.898474 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lk5vw" Feb 28 09:16:12 crc kubenswrapper[4687]: I0228 09:16:12.900156 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lk5vw" Feb 28 09:16:12 crc kubenswrapper[4687]: I0228 09:16:12.937324 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lk5vw" Feb 28 09:16:13 crc kubenswrapper[4687]: I0228 09:16:13.771671 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lk5vw" Feb 28 09:16:15 crc kubenswrapper[4687]: I0228 09:16:15.159597 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lk5vw"] Feb 28 09:16:15 crc kubenswrapper[4687]: I0228 09:16:15.755063 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lk5vw" podUID="c11f84fc-38fa-42af-9a8b-daefd98ccd7b" containerName="registry-server" containerID="cri-o://c3d7c6158a0ae1c4585dd3a2429e14458e12d11af8ee3ae042a1fe8754f187cb" gracePeriod=2 Feb 28 09:16:16 crc kubenswrapper[4687]: I0228 09:16:16.063406 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lk5vw" Feb 28 09:16:16 crc kubenswrapper[4687]: I0228 09:16:16.240284 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c11f84fc-38fa-42af-9a8b-daefd98ccd7b-utilities\") pod \"c11f84fc-38fa-42af-9a8b-daefd98ccd7b\" (UID: \"c11f84fc-38fa-42af-9a8b-daefd98ccd7b\") " Feb 28 09:16:16 crc kubenswrapper[4687]: I0228 09:16:16.240439 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c11f84fc-38fa-42af-9a8b-daefd98ccd7b-catalog-content\") pod \"c11f84fc-38fa-42af-9a8b-daefd98ccd7b\" (UID: \"c11f84fc-38fa-42af-9a8b-daefd98ccd7b\") " Feb 28 09:16:16 crc kubenswrapper[4687]: I0228 09:16:16.240520 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46mhp\" (UniqueName: \"kubernetes.io/projected/c11f84fc-38fa-42af-9a8b-daefd98ccd7b-kube-api-access-46mhp\") pod \"c11f84fc-38fa-42af-9a8b-daefd98ccd7b\" (UID: \"c11f84fc-38fa-42af-9a8b-daefd98ccd7b\") " Feb 28 09:16:16 crc kubenswrapper[4687]: I0228 09:16:16.241342 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c11f84fc-38fa-42af-9a8b-daefd98ccd7b-utilities" (OuterVolumeSpecName: "utilities") pod "c11f84fc-38fa-42af-9a8b-daefd98ccd7b" (UID: "c11f84fc-38fa-42af-9a8b-daefd98ccd7b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:16:16 crc kubenswrapper[4687]: I0228 09:16:16.246437 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c11f84fc-38fa-42af-9a8b-daefd98ccd7b-kube-api-access-46mhp" (OuterVolumeSpecName: "kube-api-access-46mhp") pod "c11f84fc-38fa-42af-9a8b-daefd98ccd7b" (UID: "c11f84fc-38fa-42af-9a8b-daefd98ccd7b"). InnerVolumeSpecName "kube-api-access-46mhp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:16:16 crc kubenswrapper[4687]: I0228 09:16:16.281512 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c11f84fc-38fa-42af-9a8b-daefd98ccd7b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c11f84fc-38fa-42af-9a8b-daefd98ccd7b" (UID: "c11f84fc-38fa-42af-9a8b-daefd98ccd7b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:16:16 crc kubenswrapper[4687]: I0228 09:16:16.342758 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c11f84fc-38fa-42af-9a8b-daefd98ccd7b-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:16:16 crc kubenswrapper[4687]: I0228 09:16:16.342796 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c11f84fc-38fa-42af-9a8b-daefd98ccd7b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:16:16 crc kubenswrapper[4687]: I0228 09:16:16.342809 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46mhp\" (UniqueName: \"kubernetes.io/projected/c11f84fc-38fa-42af-9a8b-daefd98ccd7b-kube-api-access-46mhp\") on node \"crc\" DevicePath \"\"" Feb 28 09:16:16 crc kubenswrapper[4687]: I0228 09:16:16.760986 4687 generic.go:334] "Generic (PLEG): container finished" podID="c11f84fc-38fa-42af-9a8b-daefd98ccd7b" containerID="c3d7c6158a0ae1c4585dd3a2429e14458e12d11af8ee3ae042a1fe8754f187cb" exitCode=0 Feb 28 09:16:16 crc kubenswrapper[4687]: I0228 09:16:16.761057 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lk5vw" event={"ID":"c11f84fc-38fa-42af-9a8b-daefd98ccd7b","Type":"ContainerDied","Data":"c3d7c6158a0ae1c4585dd3a2429e14458e12d11af8ee3ae042a1fe8754f187cb"} Feb 28 09:16:16 crc kubenswrapper[4687]: I0228 09:16:16.761092 4687 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-lk5vw" event={"ID":"c11f84fc-38fa-42af-9a8b-daefd98ccd7b","Type":"ContainerDied","Data":"1cea0655fe1dd0786ab327c9404d2882b0d245c403f5ea64ccd619d30c4de0a7"} Feb 28 09:16:16 crc kubenswrapper[4687]: I0228 09:16:16.761104 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lk5vw" Feb 28 09:16:16 crc kubenswrapper[4687]: I0228 09:16:16.761111 4687 scope.go:117] "RemoveContainer" containerID="c3d7c6158a0ae1c4585dd3a2429e14458e12d11af8ee3ae042a1fe8754f187cb" Feb 28 09:16:16 crc kubenswrapper[4687]: I0228 09:16:16.778990 4687 scope.go:117] "RemoveContainer" containerID="7ff570b271cb6a2a5d1e96feb74dc04ee5c8f72d3077c3a6a49a64fad8ff4b72" Feb 28 09:16:16 crc kubenswrapper[4687]: I0228 09:16:16.779662 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lk5vw"] Feb 28 09:16:16 crc kubenswrapper[4687]: I0228 09:16:16.782180 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lk5vw"] Feb 28 09:16:16 crc kubenswrapper[4687]: I0228 09:16:16.792626 4687 scope.go:117] "RemoveContainer" containerID="10cb070e1f85f49d00657d1faa2b66ea3cd3dc5ba6a0b5c1fd64364a66191f03" Feb 28 09:16:16 crc kubenswrapper[4687]: I0228 09:16:16.805318 4687 scope.go:117] "RemoveContainer" containerID="c3d7c6158a0ae1c4585dd3a2429e14458e12d11af8ee3ae042a1fe8754f187cb" Feb 28 09:16:16 crc kubenswrapper[4687]: E0228 09:16:16.806062 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3d7c6158a0ae1c4585dd3a2429e14458e12d11af8ee3ae042a1fe8754f187cb\": container with ID starting with c3d7c6158a0ae1c4585dd3a2429e14458e12d11af8ee3ae042a1fe8754f187cb not found: ID does not exist" containerID="c3d7c6158a0ae1c4585dd3a2429e14458e12d11af8ee3ae042a1fe8754f187cb" Feb 28 09:16:16 crc kubenswrapper[4687]: I0228 
09:16:16.806111 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d7c6158a0ae1c4585dd3a2429e14458e12d11af8ee3ae042a1fe8754f187cb"} err="failed to get container status \"c3d7c6158a0ae1c4585dd3a2429e14458e12d11af8ee3ae042a1fe8754f187cb\": rpc error: code = NotFound desc = could not find container \"c3d7c6158a0ae1c4585dd3a2429e14458e12d11af8ee3ae042a1fe8754f187cb\": container with ID starting with c3d7c6158a0ae1c4585dd3a2429e14458e12d11af8ee3ae042a1fe8754f187cb not found: ID does not exist" Feb 28 09:16:16 crc kubenswrapper[4687]: I0228 09:16:16.806143 4687 scope.go:117] "RemoveContainer" containerID="7ff570b271cb6a2a5d1e96feb74dc04ee5c8f72d3077c3a6a49a64fad8ff4b72" Feb 28 09:16:16 crc kubenswrapper[4687]: E0228 09:16:16.806456 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ff570b271cb6a2a5d1e96feb74dc04ee5c8f72d3077c3a6a49a64fad8ff4b72\": container with ID starting with 7ff570b271cb6a2a5d1e96feb74dc04ee5c8f72d3077c3a6a49a64fad8ff4b72 not found: ID does not exist" containerID="7ff570b271cb6a2a5d1e96feb74dc04ee5c8f72d3077c3a6a49a64fad8ff4b72" Feb 28 09:16:16 crc kubenswrapper[4687]: I0228 09:16:16.806498 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ff570b271cb6a2a5d1e96feb74dc04ee5c8f72d3077c3a6a49a64fad8ff4b72"} err="failed to get container status \"7ff570b271cb6a2a5d1e96feb74dc04ee5c8f72d3077c3a6a49a64fad8ff4b72\": rpc error: code = NotFound desc = could not find container \"7ff570b271cb6a2a5d1e96feb74dc04ee5c8f72d3077c3a6a49a64fad8ff4b72\": container with ID starting with 7ff570b271cb6a2a5d1e96feb74dc04ee5c8f72d3077c3a6a49a64fad8ff4b72 not found: ID does not exist" Feb 28 09:16:16 crc kubenswrapper[4687]: I0228 09:16:16.806525 4687 scope.go:117] "RemoveContainer" containerID="10cb070e1f85f49d00657d1faa2b66ea3cd3dc5ba6a0b5c1fd64364a66191f03" Feb 28 09:16:16 crc 
kubenswrapper[4687]: E0228 09:16:16.806847 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10cb070e1f85f49d00657d1faa2b66ea3cd3dc5ba6a0b5c1fd64364a66191f03\": container with ID starting with 10cb070e1f85f49d00657d1faa2b66ea3cd3dc5ba6a0b5c1fd64364a66191f03 not found: ID does not exist" containerID="10cb070e1f85f49d00657d1faa2b66ea3cd3dc5ba6a0b5c1fd64364a66191f03" Feb 28 09:16:16 crc kubenswrapper[4687]: I0228 09:16:16.806883 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10cb070e1f85f49d00657d1faa2b66ea3cd3dc5ba6a0b5c1fd64364a66191f03"} err="failed to get container status \"10cb070e1f85f49d00657d1faa2b66ea3cd3dc5ba6a0b5c1fd64364a66191f03\": rpc error: code = NotFound desc = could not find container \"10cb070e1f85f49d00657d1faa2b66ea3cd3dc5ba6a0b5c1fd64364a66191f03\": container with ID starting with 10cb070e1f85f49d00657d1faa2b66ea3cd3dc5ba6a0b5c1fd64364a66191f03 not found: ID does not exist" Feb 28 09:16:17 crc kubenswrapper[4687]: I0228 09:16:17.376200 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6f7cb57fd8-p9bs4" Feb 28 09:16:17 crc kubenswrapper[4687]: I0228 09:16:17.908340 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-qxhmg"] Feb 28 09:16:17 crc kubenswrapper[4687]: E0228 09:16:17.908660 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcf2df7-2440-48d7-ab5c-8ffacc7bdd5a" containerName="oc" Feb 28 09:16:17 crc kubenswrapper[4687]: I0228 09:16:17.908708 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcf2df7-2440-48d7-ab5c-8ffacc7bdd5a" containerName="oc" Feb 28 09:16:17 crc kubenswrapper[4687]: E0228 09:16:17.908722 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c11f84fc-38fa-42af-9a8b-daefd98ccd7b" 
containerName="registry-server" Feb 28 09:16:17 crc kubenswrapper[4687]: I0228 09:16:17.908730 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="c11f84fc-38fa-42af-9a8b-daefd98ccd7b" containerName="registry-server" Feb 28 09:16:17 crc kubenswrapper[4687]: E0228 09:16:17.908744 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c11f84fc-38fa-42af-9a8b-daefd98ccd7b" containerName="extract-utilities" Feb 28 09:16:17 crc kubenswrapper[4687]: I0228 09:16:17.908752 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="c11f84fc-38fa-42af-9a8b-daefd98ccd7b" containerName="extract-utilities" Feb 28 09:16:17 crc kubenswrapper[4687]: E0228 09:16:17.908765 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c11f84fc-38fa-42af-9a8b-daefd98ccd7b" containerName="extract-content" Feb 28 09:16:17 crc kubenswrapper[4687]: I0228 09:16:17.908772 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="c11f84fc-38fa-42af-9a8b-daefd98ccd7b" containerName="extract-content" Feb 28 09:16:17 crc kubenswrapper[4687]: I0228 09:16:17.908899 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcf2df7-2440-48d7-ab5c-8ffacc7bdd5a" containerName="oc" Feb 28 09:16:17 crc kubenswrapper[4687]: I0228 09:16:17.908913 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="c11f84fc-38fa-42af-9a8b-daefd98ccd7b" containerName="registry-server" Feb 28 09:16:17 crc kubenswrapper[4687]: I0228 09:16:17.909516 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-qxhmg" Feb 28 09:16:17 crc kubenswrapper[4687]: I0228 09:16:17.911657 4687 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-pvjfk" Feb 28 09:16:17 crc kubenswrapper[4687]: I0228 09:16:17.914092 4687 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 28 09:16:17 crc kubenswrapper[4687]: I0228 09:16:17.916396 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-lnfnj"] Feb 28 09:16:17 crc kubenswrapper[4687]: I0228 09:16:17.919293 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-lnfnj" Feb 28 09:16:17 crc kubenswrapper[4687]: I0228 09:16:17.921010 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 28 09:16:17 crc kubenswrapper[4687]: I0228 09:16:17.923772 4687 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 28 09:16:17 crc kubenswrapper[4687]: I0228 09:16:17.928391 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-qxhmg"] Feb 28 09:16:17 crc kubenswrapper[4687]: I0228 09:16:17.965996 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-bnlzc"] Feb 28 09:16:17 crc kubenswrapper[4687]: I0228 09:16:17.966710 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-bnlzc" Feb 28 09:16:17 crc kubenswrapper[4687]: I0228 09:16:17.968835 4687 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 28 09:16:17 crc kubenswrapper[4687]: I0228 09:16:17.968904 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 28 09:16:17 crc kubenswrapper[4687]: I0228 09:16:17.969671 4687 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-9xpwr" Feb 28 09:16:17 crc kubenswrapper[4687]: I0228 09:16:17.972830 4687 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 28 09:16:17 crc kubenswrapper[4687]: I0228 09:16:17.985100 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-tqhsm"] Feb 28 09:16:17 crc kubenswrapper[4687]: I0228 09:16:17.985941 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-tqhsm" Feb 28 09:16:17 crc kubenswrapper[4687]: I0228 09:16:17.986972 4687 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.002101 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-tqhsm"] Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.067426 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/abbd4948-5005-4b4b-b0eb-de72a0b28860-frr-conf\") pod \"frr-k8s-lnfnj\" (UID: \"abbd4948-5005-4b4b-b0eb-de72a0b28860\") " pod="metallb-system/frr-k8s-lnfnj" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.067485 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/abbd4948-5005-4b4b-b0eb-de72a0b28860-frr-startup\") pod \"frr-k8s-lnfnj\" (UID: \"abbd4948-5005-4b4b-b0eb-de72a0b28860\") " pod="metallb-system/frr-k8s-lnfnj" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.067560 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdaf3bdc-4287-4a7c-9156-613b50d6afcc-metrics-certs\") pod \"speaker-bnlzc\" (UID: \"bdaf3bdc-4287-4a7c-9156-613b50d6afcc\") " pod="metallb-system/speaker-bnlzc" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.067579 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98lr6\" (UniqueName: \"kubernetes.io/projected/abbd4948-5005-4b4b-b0eb-de72a0b28860-kube-api-access-98lr6\") pod \"frr-k8s-lnfnj\" (UID: \"abbd4948-5005-4b4b-b0eb-de72a0b28860\") " pod="metallb-system/frr-k8s-lnfnj" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 
09:16:18.067614 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/abbd4948-5005-4b4b-b0eb-de72a0b28860-reloader\") pod \"frr-k8s-lnfnj\" (UID: \"abbd4948-5005-4b4b-b0eb-de72a0b28860\") " pod="metallb-system/frr-k8s-lnfnj" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.067641 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bdaf3bdc-4287-4a7c-9156-613b50d6afcc-metallb-excludel2\") pod \"speaker-bnlzc\" (UID: \"bdaf3bdc-4287-4a7c-9156-613b50d6afcc\") " pod="metallb-system/speaker-bnlzc" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.067658 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/abbd4948-5005-4b4b-b0eb-de72a0b28860-frr-sockets\") pod \"frr-k8s-lnfnj\" (UID: \"abbd4948-5005-4b4b-b0eb-de72a0b28860\") " pod="metallb-system/frr-k8s-lnfnj" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.067680 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5djls\" (UniqueName: \"kubernetes.io/projected/bdaf3bdc-4287-4a7c-9156-613b50d6afcc-kube-api-access-5djls\") pod \"speaker-bnlzc\" (UID: \"bdaf3bdc-4287-4a7c-9156-613b50d6afcc\") " pod="metallb-system/speaker-bnlzc" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.067698 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/abbd4948-5005-4b4b-b0eb-de72a0b28860-metrics\") pod \"frr-k8s-lnfnj\" (UID: \"abbd4948-5005-4b4b-b0eb-de72a0b28860\") " pod="metallb-system/frr-k8s-lnfnj" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.067716 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bdaf3bdc-4287-4a7c-9156-613b50d6afcc-memberlist\") pod \"speaker-bnlzc\" (UID: \"bdaf3bdc-4287-4a7c-9156-613b50d6afcc\") " pod="metallb-system/speaker-bnlzc" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.067746 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5df08eed-eb11-482c-95aa-daebcccec8a8-cert\") pod \"frr-k8s-webhook-server-7f989f654f-qxhmg\" (UID: \"5df08eed-eb11-482c-95aa-daebcccec8a8\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-qxhmg" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.067778 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l79q\" (UniqueName: \"kubernetes.io/projected/5df08eed-eb11-482c-95aa-daebcccec8a8-kube-api-access-8l79q\") pod \"frr-k8s-webhook-server-7f989f654f-qxhmg\" (UID: \"5df08eed-eb11-482c-95aa-daebcccec8a8\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-qxhmg" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.067796 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/abbd4948-5005-4b4b-b0eb-de72a0b28860-metrics-certs\") pod \"frr-k8s-lnfnj\" (UID: \"abbd4948-5005-4b4b-b0eb-de72a0b28860\") " pod="metallb-system/frr-k8s-lnfnj" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.169780 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdaf3bdc-4287-4a7c-9156-613b50d6afcc-metrics-certs\") pod \"speaker-bnlzc\" (UID: \"bdaf3bdc-4287-4a7c-9156-613b50d6afcc\") " pod="metallb-system/speaker-bnlzc" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.169829 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98lr6\" 
(UniqueName: \"kubernetes.io/projected/abbd4948-5005-4b4b-b0eb-de72a0b28860-kube-api-access-98lr6\") pod \"frr-k8s-lnfnj\" (UID: \"abbd4948-5005-4b4b-b0eb-de72a0b28860\") " pod="metallb-system/frr-k8s-lnfnj" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.169867 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkt5k\" (UniqueName: \"kubernetes.io/projected/87d609a5-fd9a-4473-80e4-b94dc583b438-kube-api-access-bkt5k\") pod \"controller-86ddb6bd46-tqhsm\" (UID: \"87d609a5-fd9a-4473-80e4-b94dc583b438\") " pod="metallb-system/controller-86ddb6bd46-tqhsm" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.169924 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/abbd4948-5005-4b4b-b0eb-de72a0b28860-reloader\") pod \"frr-k8s-lnfnj\" (UID: \"abbd4948-5005-4b4b-b0eb-de72a0b28860\") " pod="metallb-system/frr-k8s-lnfnj" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.169977 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bdaf3bdc-4287-4a7c-9156-613b50d6afcc-metallb-excludel2\") pod \"speaker-bnlzc\" (UID: \"bdaf3bdc-4287-4a7c-9156-613b50d6afcc\") " pod="metallb-system/speaker-bnlzc" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.169997 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/abbd4948-5005-4b4b-b0eb-de72a0b28860-frr-sockets\") pod \"frr-k8s-lnfnj\" (UID: \"abbd4948-5005-4b4b-b0eb-de72a0b28860\") " pod="metallb-system/frr-k8s-lnfnj" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.170043 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5djls\" (UniqueName: \"kubernetes.io/projected/bdaf3bdc-4287-4a7c-9156-613b50d6afcc-kube-api-access-5djls\") pod 
\"speaker-bnlzc\" (UID: \"bdaf3bdc-4287-4a7c-9156-613b50d6afcc\") " pod="metallb-system/speaker-bnlzc" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.170068 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bdaf3bdc-4287-4a7c-9156-613b50d6afcc-memberlist\") pod \"speaker-bnlzc\" (UID: \"bdaf3bdc-4287-4a7c-9156-613b50d6afcc\") " pod="metallb-system/speaker-bnlzc" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.170087 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/abbd4948-5005-4b4b-b0eb-de72a0b28860-metrics\") pod \"frr-k8s-lnfnj\" (UID: \"abbd4948-5005-4b4b-b0eb-de72a0b28860\") " pod="metallb-system/frr-k8s-lnfnj" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.170124 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5df08eed-eb11-482c-95aa-daebcccec8a8-cert\") pod \"frr-k8s-webhook-server-7f989f654f-qxhmg\" (UID: \"5df08eed-eb11-482c-95aa-daebcccec8a8\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-qxhmg" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.170158 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87d609a5-fd9a-4473-80e4-b94dc583b438-metrics-certs\") pod \"controller-86ddb6bd46-tqhsm\" (UID: \"87d609a5-fd9a-4473-80e4-b94dc583b438\") " pod="metallb-system/controller-86ddb6bd46-tqhsm" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.170179 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l79q\" (UniqueName: \"kubernetes.io/projected/5df08eed-eb11-482c-95aa-daebcccec8a8-kube-api-access-8l79q\") pod \"frr-k8s-webhook-server-7f989f654f-qxhmg\" (UID: \"5df08eed-eb11-482c-95aa-daebcccec8a8\") " 
pod="metallb-system/frr-k8s-webhook-server-7f989f654f-qxhmg" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.170201 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/abbd4948-5005-4b4b-b0eb-de72a0b28860-metrics-certs\") pod \"frr-k8s-lnfnj\" (UID: \"abbd4948-5005-4b4b-b0eb-de72a0b28860\") " pod="metallb-system/frr-k8s-lnfnj" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.170226 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/abbd4948-5005-4b4b-b0eb-de72a0b28860-frr-conf\") pod \"frr-k8s-lnfnj\" (UID: \"abbd4948-5005-4b4b-b0eb-de72a0b28860\") " pod="metallb-system/frr-k8s-lnfnj" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.170252 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/abbd4948-5005-4b4b-b0eb-de72a0b28860-frr-startup\") pod \"frr-k8s-lnfnj\" (UID: \"abbd4948-5005-4b4b-b0eb-de72a0b28860\") " pod="metallb-system/frr-k8s-lnfnj" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.170278 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87d609a5-fd9a-4473-80e4-b94dc583b438-cert\") pod \"controller-86ddb6bd46-tqhsm\" (UID: \"87d609a5-fd9a-4473-80e4-b94dc583b438\") " pod="metallb-system/controller-86ddb6bd46-tqhsm" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.170316 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/abbd4948-5005-4b4b-b0eb-de72a0b28860-reloader\") pod \"frr-k8s-lnfnj\" (UID: \"abbd4948-5005-4b4b-b0eb-de72a0b28860\") " pod="metallb-system/frr-k8s-lnfnj" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.170548 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/abbd4948-5005-4b4b-b0eb-de72a0b28860-frr-conf\") pod \"frr-k8s-lnfnj\" (UID: \"abbd4948-5005-4b4b-b0eb-de72a0b28860\") " pod="metallb-system/frr-k8s-lnfnj" Feb 28 09:16:18 crc kubenswrapper[4687]: E0228 09:16:18.170919 4687 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 28 09:16:18 crc kubenswrapper[4687]: E0228 09:16:18.171092 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdaf3bdc-4287-4a7c-9156-613b50d6afcc-memberlist podName:bdaf3bdc-4287-4a7c-9156-613b50d6afcc nodeName:}" failed. No retries permitted until 2026-02-28 09:16:18.671066284 +0000 UTC m=+770.361635620 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/bdaf3bdc-4287-4a7c-9156-613b50d6afcc-memberlist") pod "speaker-bnlzc" (UID: "bdaf3bdc-4287-4a7c-9156-613b50d6afcc") : secret "metallb-memberlist" not found Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.171246 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/abbd4948-5005-4b4b-b0eb-de72a0b28860-frr-startup\") pod \"frr-k8s-lnfnj\" (UID: \"abbd4948-5005-4b4b-b0eb-de72a0b28860\") " pod="metallb-system/frr-k8s-lnfnj" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.171468 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/abbd4948-5005-4b4b-b0eb-de72a0b28860-metrics\") pod \"frr-k8s-lnfnj\" (UID: \"abbd4948-5005-4b4b-b0eb-de72a0b28860\") " pod="metallb-system/frr-k8s-lnfnj" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.171506 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bdaf3bdc-4287-4a7c-9156-613b50d6afcc-metallb-excludel2\") pod \"speaker-bnlzc\" (UID: 
\"bdaf3bdc-4287-4a7c-9156-613b50d6afcc\") " pod="metallb-system/speaker-bnlzc" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.171628 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/abbd4948-5005-4b4b-b0eb-de72a0b28860-frr-sockets\") pod \"frr-k8s-lnfnj\" (UID: \"abbd4948-5005-4b4b-b0eb-de72a0b28860\") " pod="metallb-system/frr-k8s-lnfnj" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.175054 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5df08eed-eb11-482c-95aa-daebcccec8a8-cert\") pod \"frr-k8s-webhook-server-7f989f654f-qxhmg\" (UID: \"5df08eed-eb11-482c-95aa-daebcccec8a8\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-qxhmg" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.181831 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/abbd4948-5005-4b4b-b0eb-de72a0b28860-metrics-certs\") pod \"frr-k8s-lnfnj\" (UID: \"abbd4948-5005-4b4b-b0eb-de72a0b28860\") " pod="metallb-system/frr-k8s-lnfnj" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.182300 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdaf3bdc-4287-4a7c-9156-613b50d6afcc-metrics-certs\") pod \"speaker-bnlzc\" (UID: \"bdaf3bdc-4287-4a7c-9156-613b50d6afcc\") " pod="metallb-system/speaker-bnlzc" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.183679 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l79q\" (UniqueName: \"kubernetes.io/projected/5df08eed-eb11-482c-95aa-daebcccec8a8-kube-api-access-8l79q\") pod \"frr-k8s-webhook-server-7f989f654f-qxhmg\" (UID: \"5df08eed-eb11-482c-95aa-daebcccec8a8\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-qxhmg" Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 
09:16:18.190509 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98lr6\" (UniqueName: \"kubernetes.io/projected/abbd4948-5005-4b4b-b0eb-de72a0b28860-kube-api-access-98lr6\") pod \"frr-k8s-lnfnj\" (UID: \"abbd4948-5005-4b4b-b0eb-de72a0b28860\") " pod="metallb-system/frr-k8s-lnfnj"
Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.191196 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5djls\" (UniqueName: \"kubernetes.io/projected/bdaf3bdc-4287-4a7c-9156-613b50d6afcc-kube-api-access-5djls\") pod \"speaker-bnlzc\" (UID: \"bdaf3bdc-4287-4a7c-9156-613b50d6afcc\") " pod="metallb-system/speaker-bnlzc"
Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.223244 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-qxhmg"
Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.231770 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-lnfnj"
Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.271741 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87d609a5-fd9a-4473-80e4-b94dc583b438-metrics-certs\") pod \"controller-86ddb6bd46-tqhsm\" (UID: \"87d609a5-fd9a-4473-80e4-b94dc583b438\") " pod="metallb-system/controller-86ddb6bd46-tqhsm"
Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.271805 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87d609a5-fd9a-4473-80e4-b94dc583b438-cert\") pod \"controller-86ddb6bd46-tqhsm\" (UID: \"87d609a5-fd9a-4473-80e4-b94dc583b438\") " pod="metallb-system/controller-86ddb6bd46-tqhsm"
Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.271841 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkt5k\" (UniqueName: \"kubernetes.io/projected/87d609a5-fd9a-4473-80e4-b94dc583b438-kube-api-access-bkt5k\") pod \"controller-86ddb6bd46-tqhsm\" (UID: \"87d609a5-fd9a-4473-80e4-b94dc583b438\") " pod="metallb-system/controller-86ddb6bd46-tqhsm"
Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.276198 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87d609a5-fd9a-4473-80e4-b94dc583b438-metrics-certs\") pod \"controller-86ddb6bd46-tqhsm\" (UID: \"87d609a5-fd9a-4473-80e4-b94dc583b438\") " pod="metallb-system/controller-86ddb6bd46-tqhsm"
Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.276423 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87d609a5-fd9a-4473-80e4-b94dc583b438-cert\") pod \"controller-86ddb6bd46-tqhsm\" (UID: \"87d609a5-fd9a-4473-80e4-b94dc583b438\") " pod="metallb-system/controller-86ddb6bd46-tqhsm"
Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.293371 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkt5k\" (UniqueName: \"kubernetes.io/projected/87d609a5-fd9a-4473-80e4-b94dc583b438-kube-api-access-bkt5k\") pod \"controller-86ddb6bd46-tqhsm\" (UID: \"87d609a5-fd9a-4473-80e4-b94dc583b438\") " pod="metallb-system/controller-86ddb6bd46-tqhsm"
Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.295240 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-tqhsm"
Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.662791 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c11f84fc-38fa-42af-9a8b-daefd98ccd7b" path="/var/lib/kubelet/pods/c11f84fc-38fa-42af-9a8b-daefd98ccd7b/volumes"
Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.677755 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bdaf3bdc-4287-4a7c-9156-613b50d6afcc-memberlist\") pod \"speaker-bnlzc\" (UID: \"bdaf3bdc-4287-4a7c-9156-613b50d6afcc\") " pod="metallb-system/speaker-bnlzc"
Feb 28 09:16:18 crc kubenswrapper[4687]: E0228 09:16:18.677941 4687 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Feb 28 09:16:18 crc kubenswrapper[4687]: E0228 09:16:18.678014 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdaf3bdc-4287-4a7c-9156-613b50d6afcc-memberlist podName:bdaf3bdc-4287-4a7c-9156-613b50d6afcc nodeName:}" failed. No retries permitted until 2026-02-28 09:16:19.677996258 +0000 UTC m=+771.368565595 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/bdaf3bdc-4287-4a7c-9156-613b50d6afcc-memberlist") pod "speaker-bnlzc" (UID: "bdaf3bdc-4287-4a7c-9156-613b50d6afcc") : secret "metallb-memberlist" not found
Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.723304 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-qxhmg"]
Feb 28 09:16:18 crc kubenswrapper[4687]: W0228 09:16:18.727329 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5df08eed_eb11_482c_95aa_daebcccec8a8.slice/crio-1ac90efcbe27130f62313e368c9f9aa1949b3639a8380cd28520f4459f088c4e WatchSource:0}: Error finding container 1ac90efcbe27130f62313e368c9f9aa1949b3639a8380cd28520f4459f088c4e: Status 404 returned error can't find the container with id 1ac90efcbe27130f62313e368c9f9aa1949b3639a8380cd28520f4459f088c4e
Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.759002 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-tqhsm"]
Feb 28 09:16:18 crc kubenswrapper[4687]: W0228 09:16:18.760055 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87d609a5_fd9a_4473_80e4_b94dc583b438.slice/crio-3939549afca1cbe25accc1d98aab421a3c0ef2fabde0c15dba1535107bfc551c WatchSource:0}: Error finding container 3939549afca1cbe25accc1d98aab421a3c0ef2fabde0c15dba1535107bfc551c: Status 404 returned error can't find the container with id 3939549afca1cbe25accc1d98aab421a3c0ef2fabde0c15dba1535107bfc551c
Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.776288 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lnfnj" event={"ID":"abbd4948-5005-4b4b-b0eb-de72a0b28860","Type":"ContainerStarted","Data":"ce8e7f7b747fadece297173de22f58d2f3c49ab56a3c20bb1f92e735d6af3566"}
Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.777451 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-qxhmg" event={"ID":"5df08eed-eb11-482c-95aa-daebcccec8a8","Type":"ContainerStarted","Data":"1ac90efcbe27130f62313e368c9f9aa1949b3639a8380cd28520f4459f088c4e"}
Feb 28 09:16:18 crc kubenswrapper[4687]: I0228 09:16:18.779038 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-tqhsm" event={"ID":"87d609a5-fd9a-4473-80e4-b94dc583b438","Type":"ContainerStarted","Data":"3939549afca1cbe25accc1d98aab421a3c0ef2fabde0c15dba1535107bfc551c"}
Feb 28 09:16:19 crc kubenswrapper[4687]: I0228 09:16:19.689923 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bdaf3bdc-4287-4a7c-9156-613b50d6afcc-memberlist\") pod \"speaker-bnlzc\" (UID: \"bdaf3bdc-4287-4a7c-9156-613b50d6afcc\") " pod="metallb-system/speaker-bnlzc"
Feb 28 09:16:19 crc kubenswrapper[4687]: I0228 09:16:19.697253 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bdaf3bdc-4287-4a7c-9156-613b50d6afcc-memberlist\") pod \"speaker-bnlzc\" (UID: \"bdaf3bdc-4287-4a7c-9156-613b50d6afcc\") " pod="metallb-system/speaker-bnlzc"
Feb 28 09:16:19 crc kubenswrapper[4687]: I0228 09:16:19.777562 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-bnlzc"
Feb 28 09:16:19 crc kubenswrapper[4687]: I0228 09:16:19.793567 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-tqhsm" event={"ID":"87d609a5-fd9a-4473-80e4-b94dc583b438","Type":"ContainerStarted","Data":"fb05ed8eacd2cc519a0d1ede28dc1f42544ae0231c6be6cdc07c707c2e09a6bb"}
Feb 28 09:16:19 crc kubenswrapper[4687]: I0228 09:16:19.793620 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-tqhsm" event={"ID":"87d609a5-fd9a-4473-80e4-b94dc583b438","Type":"ContainerStarted","Data":"1f837ed6eb25e3072e2891e1b591c8195bd9fea61bc877c1ee6845571afc77e2"}
Feb 28 09:16:19 crc kubenswrapper[4687]: I0228 09:16:19.794117 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-tqhsm"
Feb 28 09:16:19 crc kubenswrapper[4687]: W0228 09:16:19.799210 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdaf3bdc_4287_4a7c_9156_613b50d6afcc.slice/crio-f0bd401c1b301c11e4c3ab8c377deeffe22fc724fa636a2c6cee9d78d1de3c90 WatchSource:0}: Error finding container f0bd401c1b301c11e4c3ab8c377deeffe22fc724fa636a2c6cee9d78d1de3c90: Status 404 returned error can't find the container with id f0bd401c1b301c11e4c3ab8c377deeffe22fc724fa636a2c6cee9d78d1de3c90
Feb 28 09:16:19 crc kubenswrapper[4687]: I0228 09:16:19.814651 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-tqhsm" podStartSLOduration=2.814630992 podStartE2EDuration="2.814630992s" podCreationTimestamp="2026-02-28 09:16:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:16:19.809798152 +0000 UTC m=+771.500367489" watchObservedRunningTime="2026-02-28 09:16:19.814630992 +0000 UTC m=+771.505200329"
Feb 28 09:16:20 crc kubenswrapper[4687]: I0228 09:16:20.803590 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bnlzc" event={"ID":"bdaf3bdc-4287-4a7c-9156-613b50d6afcc","Type":"ContainerStarted","Data":"b97a65dd24a2abff7f55438930ae333bc9909e98db359eb0aca15cbb0d82fce0"}
Feb 28 09:16:20 crc kubenswrapper[4687]: I0228 09:16:20.803630 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bnlzc" event={"ID":"bdaf3bdc-4287-4a7c-9156-613b50d6afcc","Type":"ContainerStarted","Data":"1a953d8d12b3f0b047009a53834f32f93e5caff1b37cded98391a40fc2a85c25"}
Feb 28 09:16:20 crc kubenswrapper[4687]: I0228 09:16:20.803642 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bnlzc" event={"ID":"bdaf3bdc-4287-4a7c-9156-613b50d6afcc","Type":"ContainerStarted","Data":"f0bd401c1b301c11e4c3ab8c377deeffe22fc724fa636a2c6cee9d78d1de3c90"}
Feb 28 09:16:20 crc kubenswrapper[4687]: I0228 09:16:20.803764 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-bnlzc"
Feb 28 09:16:24 crc kubenswrapper[4687]: I0228 09:16:24.831058 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-qxhmg" event={"ID":"5df08eed-eb11-482c-95aa-daebcccec8a8","Type":"ContainerStarted","Data":"89c346e716f10d28d1352de79c863b922145aeaa82617a45b3077641478254ab"}
Feb 28 09:16:24 crc kubenswrapper[4687]: I0228 09:16:24.831653 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-qxhmg"
Feb 28 09:16:24 crc kubenswrapper[4687]: I0228 09:16:24.833927 4687 generic.go:334] "Generic (PLEG): container finished" podID="abbd4948-5005-4b4b-b0eb-de72a0b28860" containerID="4d7a9c73e6e737c0b2af8ee0a5412c01c7d4f4d1112098ab05415bb50b827469" exitCode=0
Feb 28 09:16:24 crc kubenswrapper[4687]: I0228 09:16:24.833984 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lnfnj" event={"ID":"abbd4948-5005-4b4b-b0eb-de72a0b28860","Type":"ContainerDied","Data":"4d7a9c73e6e737c0b2af8ee0a5412c01c7d4f4d1112098ab05415bb50b827469"}
Feb 28 09:16:24 crc kubenswrapper[4687]: I0228 09:16:24.847687 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-bnlzc" podStartSLOduration=7.847674263 podStartE2EDuration="7.847674263s" podCreationTimestamp="2026-02-28 09:16:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:16:20.8192333 +0000 UTC m=+772.509802647" watchObservedRunningTime="2026-02-28 09:16:24.847674263 +0000 UTC m=+776.538243599"
Feb 28 09:16:24 crc kubenswrapper[4687]: I0228 09:16:24.849247 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-qxhmg" podStartSLOduration=2.014509703 podStartE2EDuration="7.849242983s" podCreationTimestamp="2026-02-28 09:16:17 +0000 UTC" firstStartedPulling="2026-02-28 09:16:18.729848575 +0000 UTC m=+770.420417912" lastFinishedPulling="2026-02-28 09:16:24.564581854 +0000 UTC m=+776.255151192" observedRunningTime="2026-02-28 09:16:24.844903211 +0000 UTC m=+776.535472547" watchObservedRunningTime="2026-02-28 09:16:24.849242983 +0000 UTC m=+776.539812320"
Feb 28 09:16:25 crc kubenswrapper[4687]: I0228 09:16:25.002532 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 28 09:16:25 crc kubenswrapper[4687]: I0228 09:16:25.002669 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 28 09:16:25 crc kubenswrapper[4687]: I0228 09:16:25.840203 4687 generic.go:334] "Generic (PLEG): container finished" podID="abbd4948-5005-4b4b-b0eb-de72a0b28860" containerID="d76ebf72dee451dfdb9a6ce11257bcd7648e71ac1c6de7291f9b5d3c0ba38227" exitCode=0
Feb 28 09:16:25 crc kubenswrapper[4687]: I0228 09:16:25.840295 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lnfnj" event={"ID":"abbd4948-5005-4b4b-b0eb-de72a0b28860","Type":"ContainerDied","Data":"d76ebf72dee451dfdb9a6ce11257bcd7648e71ac1c6de7291f9b5d3c0ba38227"}
Feb 28 09:16:26 crc kubenswrapper[4687]: I0228 09:16:26.850949 4687 generic.go:334] "Generic (PLEG): container finished" podID="abbd4948-5005-4b4b-b0eb-de72a0b28860" containerID="b08d850d600fb931e0678ed7f232dd0797fdef645b50e0cb0d17baf2134ef1bb" exitCode=0
Feb 28 09:16:26 crc kubenswrapper[4687]: I0228 09:16:26.851117 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lnfnj" event={"ID":"abbd4948-5005-4b4b-b0eb-de72a0b28860","Type":"ContainerDied","Data":"b08d850d600fb931e0678ed7f232dd0797fdef645b50e0cb0d17baf2134ef1bb"}
Feb 28 09:16:27 crc kubenswrapper[4687]: I0228 09:16:27.869802 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lnfnj" event={"ID":"abbd4948-5005-4b4b-b0eb-de72a0b28860","Type":"ContainerStarted","Data":"bc56d73d548193c3bbe88a424f8a4daab383182344e1191fe6b4b544c9d0f5bc"}
Feb 28 09:16:27 crc kubenswrapper[4687]: I0228 09:16:27.870344 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-lnfnj"
Feb 28 09:16:27 crc kubenswrapper[4687]: I0228 09:16:27.870359 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lnfnj" event={"ID":"abbd4948-5005-4b4b-b0eb-de72a0b28860","Type":"ContainerStarted","Data":"7d183fd5979baf233f941d93b13db05d8cce5fbd9cfbda5d06306ef64ceaad15"}
Feb 28 09:16:27 crc kubenswrapper[4687]: I0228 09:16:27.870372 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lnfnj" event={"ID":"abbd4948-5005-4b4b-b0eb-de72a0b28860","Type":"ContainerStarted","Data":"6c4ce2282c9065ed6395180f9b6840f5522cbd966230347a110b501e40371732"}
Feb 28 09:16:27 crc kubenswrapper[4687]: I0228 09:16:27.870382 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lnfnj" event={"ID":"abbd4948-5005-4b4b-b0eb-de72a0b28860","Type":"ContainerStarted","Data":"b8d357c02b23e01dc5ccfd250c8120cb839be9cf351b594953e1b5c8beccdab7"}
Feb 28 09:16:27 crc kubenswrapper[4687]: I0228 09:16:27.870392 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lnfnj" event={"ID":"abbd4948-5005-4b4b-b0eb-de72a0b28860","Type":"ContainerStarted","Data":"6a04d786d6feaca65676413a8fae8e5603bd9fbbe824e5891bd495e7d88cef14"}
Feb 28 09:16:27 crc kubenswrapper[4687]: I0228 09:16:27.870401 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lnfnj" event={"ID":"abbd4948-5005-4b4b-b0eb-de72a0b28860","Type":"ContainerStarted","Data":"56b51f7a018e7cb23f71ace710cf0cdbe94ad0bd50ad52d3acb70c74e67dbaf3"}
Feb 28 09:16:27 crc kubenswrapper[4687]: I0228 09:16:27.892109 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-lnfnj" podStartSLOduration=4.776399839 podStartE2EDuration="10.892092804s" podCreationTimestamp="2026-02-28 09:16:17 +0000 UTC" firstStartedPulling="2026-02-28 09:16:18.44393436 +0000 UTC m=+770.134503698" lastFinishedPulling="2026-02-28 09:16:24.559627326 +0000 UTC m=+776.250196663" observedRunningTime="2026-02-28 09:16:27.88956544 +0000 UTC m=+779.580134787" watchObservedRunningTime="2026-02-28 09:16:27.892092804 +0000 UTC m=+779.582662141"
Feb 28 09:16:28 crc kubenswrapper[4687]: I0228 09:16:28.232814 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-lnfnj"
Feb 28 09:16:28 crc kubenswrapper[4687]: I0228 09:16:28.262741 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-lnfnj"
Feb 28 09:16:28 crc kubenswrapper[4687]: I0228 09:16:28.299923 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-tqhsm"
Feb 28 09:16:29 crc kubenswrapper[4687]: I0228 09:16:29.781385 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-bnlzc"
Feb 28 09:16:32 crc kubenswrapper[4687]: I0228 09:16:32.043841 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-69x47"]
Feb 28 09:16:32 crc kubenswrapper[4687]: I0228 09:16:32.044943 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-69x47"
Feb 28 09:16:32 crc kubenswrapper[4687]: I0228 09:16:32.047217 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Feb 28 09:16:32 crc kubenswrapper[4687]: I0228 09:16:32.047713 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Feb 28 09:16:32 crc kubenswrapper[4687]: I0228 09:16:32.047963 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-ld57b"
Feb 28 09:16:32 crc kubenswrapper[4687]: I0228 09:16:32.057165 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-69x47"]
Feb 28 09:16:32 crc kubenswrapper[4687]: I0228 09:16:32.067211 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp5nv\" (UniqueName: \"kubernetes.io/projected/5ad66e91-ca38-4a4f-9f4e-286cdfd930e9-kube-api-access-pp5nv\") pod \"openstack-operator-index-69x47\" (UID: \"5ad66e91-ca38-4a4f-9f4e-286cdfd930e9\") " pod="openstack-operators/openstack-operator-index-69x47"
Feb 28 09:16:32 crc kubenswrapper[4687]: I0228 09:16:32.168810 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp5nv\" (UniqueName: \"kubernetes.io/projected/5ad66e91-ca38-4a4f-9f4e-286cdfd930e9-kube-api-access-pp5nv\") pod \"openstack-operator-index-69x47\" (UID: \"5ad66e91-ca38-4a4f-9f4e-286cdfd930e9\") " pod="openstack-operators/openstack-operator-index-69x47"
Feb 28 09:16:32 crc kubenswrapper[4687]: I0228 09:16:32.187521 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp5nv\" (UniqueName: \"kubernetes.io/projected/5ad66e91-ca38-4a4f-9f4e-286cdfd930e9-kube-api-access-pp5nv\") pod \"openstack-operator-index-69x47\" (UID: \"5ad66e91-ca38-4a4f-9f4e-286cdfd930e9\") " pod="openstack-operators/openstack-operator-index-69x47"
Feb 28 09:16:32 crc kubenswrapper[4687]: I0228 09:16:32.358800 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-69x47"
Feb 28 09:16:32 crc kubenswrapper[4687]: I0228 09:16:32.720496 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-69x47"]
Feb 28 09:16:32 crc kubenswrapper[4687]: W0228 09:16:32.725564 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ad66e91_ca38_4a4f_9f4e_286cdfd930e9.slice/crio-c58d8643d6ddfae32ef1de3d116fe59a155890bef22fff0cf2ff57e122cfa123 WatchSource:0}: Error finding container c58d8643d6ddfae32ef1de3d116fe59a155890bef22fff0cf2ff57e122cfa123: Status 404 returned error can't find the container with id c58d8643d6ddfae32ef1de3d116fe59a155890bef22fff0cf2ff57e122cfa123
Feb 28 09:16:32 crc kubenswrapper[4687]: I0228 09:16:32.904970 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-69x47" event={"ID":"5ad66e91-ca38-4a4f-9f4e-286cdfd930e9","Type":"ContainerStarted","Data":"c58d8643d6ddfae32ef1de3d116fe59a155890bef22fff0cf2ff57e122cfa123"}
Feb 28 09:16:33 crc kubenswrapper[4687]: I0228 09:16:33.912065 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-69x47" event={"ID":"5ad66e91-ca38-4a4f-9f4e-286cdfd930e9","Type":"ContainerStarted","Data":"4b70f69c34968a1a63959d124f8a1c19b6c264d6ff57498ae6dcc7fb1f1ceb81"}
Feb 28 09:16:33 crc kubenswrapper[4687]: I0228 09:16:33.928953 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-69x47" podStartSLOduration=1.06641212 podStartE2EDuration="1.928936415s" podCreationTimestamp="2026-02-28 09:16:32 +0000 UTC" firstStartedPulling="2026-02-28 09:16:32.728114921 +0000 UTC m=+784.418684257" lastFinishedPulling="2026-02-28 09:16:33.590639215 +0000 UTC m=+785.281208552" observedRunningTime="2026-02-28 09:16:33.923387398 +0000 UTC m=+785.613956734" watchObservedRunningTime="2026-02-28 09:16:33.928936415 +0000 UTC m=+785.619505753"
Feb 28 09:16:35 crc kubenswrapper[4687]: I0228 09:16:35.425364 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-69x47"]
Feb 28 09:16:35 crc kubenswrapper[4687]: I0228 09:16:35.924805 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-69x47" podUID="5ad66e91-ca38-4a4f-9f4e-286cdfd930e9" containerName="registry-server" containerID="cri-o://4b70f69c34968a1a63959d124f8a1c19b6c264d6ff57498ae6dcc7fb1f1ceb81" gracePeriod=2
Feb 28 09:16:36 crc kubenswrapper[4687]: I0228 09:16:36.035091 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-qtbgc"]
Feb 28 09:16:36 crc kubenswrapper[4687]: I0228 09:16:36.036334 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qtbgc"
Feb 28 09:16:36 crc kubenswrapper[4687]: I0228 09:16:36.040041 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qtbgc"]
Feb 28 09:16:36 crc kubenswrapper[4687]: I0228 09:16:36.219284 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prmmn\" (UniqueName: \"kubernetes.io/projected/c15f16ef-addd-4cba-b2c3-69b4691fa2c7-kube-api-access-prmmn\") pod \"openstack-operator-index-qtbgc\" (UID: \"c15f16ef-addd-4cba-b2c3-69b4691fa2c7\") " pod="openstack-operators/openstack-operator-index-qtbgc"
Feb 28 09:16:36 crc kubenswrapper[4687]: I0228 09:16:36.227509 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-69x47"
Feb 28 09:16:36 crc kubenswrapper[4687]: I0228 09:16:36.320323 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prmmn\" (UniqueName: \"kubernetes.io/projected/c15f16ef-addd-4cba-b2c3-69b4691fa2c7-kube-api-access-prmmn\") pod \"openstack-operator-index-qtbgc\" (UID: \"c15f16ef-addd-4cba-b2c3-69b4691fa2c7\") " pod="openstack-operators/openstack-operator-index-qtbgc"
Feb 28 09:16:36 crc kubenswrapper[4687]: I0228 09:16:36.338903 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prmmn\" (UniqueName: \"kubernetes.io/projected/c15f16ef-addd-4cba-b2c3-69b4691fa2c7-kube-api-access-prmmn\") pod \"openstack-operator-index-qtbgc\" (UID: \"c15f16ef-addd-4cba-b2c3-69b4691fa2c7\") " pod="openstack-operators/openstack-operator-index-qtbgc"
Feb 28 09:16:36 crc kubenswrapper[4687]: I0228 09:16:36.358119 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qtbgc"
Feb 28 09:16:36 crc kubenswrapper[4687]: I0228 09:16:36.421174 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pp5nv\" (UniqueName: \"kubernetes.io/projected/5ad66e91-ca38-4a4f-9f4e-286cdfd930e9-kube-api-access-pp5nv\") pod \"5ad66e91-ca38-4a4f-9f4e-286cdfd930e9\" (UID: \"5ad66e91-ca38-4a4f-9f4e-286cdfd930e9\") "
Feb 28 09:16:36 crc kubenswrapper[4687]: I0228 09:16:36.424147 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ad66e91-ca38-4a4f-9f4e-286cdfd930e9-kube-api-access-pp5nv" (OuterVolumeSpecName: "kube-api-access-pp5nv") pod "5ad66e91-ca38-4a4f-9f4e-286cdfd930e9" (UID: "5ad66e91-ca38-4a4f-9f4e-286cdfd930e9"). InnerVolumeSpecName "kube-api-access-pp5nv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 09:16:36 crc kubenswrapper[4687]: I0228 09:16:36.524402 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pp5nv\" (UniqueName: \"kubernetes.io/projected/5ad66e91-ca38-4a4f-9f4e-286cdfd930e9-kube-api-access-pp5nv\") on node \"crc\" DevicePath \"\""
Feb 28 09:16:36 crc kubenswrapper[4687]: I0228 09:16:36.722675 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qtbgc"]
Feb 28 09:16:36 crc kubenswrapper[4687]: W0228 09:16:36.731631 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc15f16ef_addd_4cba_b2c3_69b4691fa2c7.slice/crio-0f12da81afef44df85ff62bda637fbd86a692f18d533c7242143952e77f69a3b WatchSource:0}: Error finding container 0f12da81afef44df85ff62bda637fbd86a692f18d533c7242143952e77f69a3b: Status 404 returned error can't find the container with id 0f12da81afef44df85ff62bda637fbd86a692f18d533c7242143952e77f69a3b
Feb 28 09:16:36 crc kubenswrapper[4687]: I0228 09:16:36.933117 4687 generic.go:334] "Generic (PLEG): container finished" podID="5ad66e91-ca38-4a4f-9f4e-286cdfd930e9" containerID="4b70f69c34968a1a63959d124f8a1c19b6c264d6ff57498ae6dcc7fb1f1ceb81" exitCode=0
Feb 28 09:16:36 crc kubenswrapper[4687]: I0228 09:16:36.933188 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-69x47" event={"ID":"5ad66e91-ca38-4a4f-9f4e-286cdfd930e9","Type":"ContainerDied","Data":"4b70f69c34968a1a63959d124f8a1c19b6c264d6ff57498ae6dcc7fb1f1ceb81"}
Feb 28 09:16:36 crc kubenswrapper[4687]: I0228 09:16:36.933222 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-69x47" event={"ID":"5ad66e91-ca38-4a4f-9f4e-286cdfd930e9","Type":"ContainerDied","Data":"c58d8643d6ddfae32ef1de3d116fe59a155890bef22fff0cf2ff57e122cfa123"}
Feb 28 09:16:36 crc kubenswrapper[4687]: I0228 09:16:36.933239 4687 scope.go:117] "RemoveContainer" containerID="4b70f69c34968a1a63959d124f8a1c19b6c264d6ff57498ae6dcc7fb1f1ceb81"
Feb 28 09:16:36 crc kubenswrapper[4687]: I0228 09:16:36.933335 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-69x47"
Feb 28 09:16:36 crc kubenswrapper[4687]: I0228 09:16:36.935752 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qtbgc" event={"ID":"c15f16ef-addd-4cba-b2c3-69b4691fa2c7","Type":"ContainerStarted","Data":"0f12da81afef44df85ff62bda637fbd86a692f18d533c7242143952e77f69a3b"}
Feb 28 09:16:36 crc kubenswrapper[4687]: I0228 09:16:36.951687 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-69x47"]
Feb 28 09:16:36 crc kubenswrapper[4687]: I0228 09:16:36.952475 4687 scope.go:117] "RemoveContainer" containerID="4b70f69c34968a1a63959d124f8a1c19b6c264d6ff57498ae6dcc7fb1f1ceb81"
Feb 28 09:16:36 crc kubenswrapper[4687]: E0228 09:16:36.952938 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b70f69c34968a1a63959d124f8a1c19b6c264d6ff57498ae6dcc7fb1f1ceb81\": container with ID starting with 4b70f69c34968a1a63959d124f8a1c19b6c264d6ff57498ae6dcc7fb1f1ceb81 not found: ID does not exist" containerID="4b70f69c34968a1a63959d124f8a1c19b6c264d6ff57498ae6dcc7fb1f1ceb81"
Feb 28 09:16:36 crc kubenswrapper[4687]: I0228 09:16:36.952984 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b70f69c34968a1a63959d124f8a1c19b6c264d6ff57498ae6dcc7fb1f1ceb81"} err="failed to get container status \"4b70f69c34968a1a63959d124f8a1c19b6c264d6ff57498ae6dcc7fb1f1ceb81\": rpc error: code = NotFound desc = could not find container \"4b70f69c34968a1a63959d124f8a1c19b6c264d6ff57498ae6dcc7fb1f1ceb81\": container with ID starting with 4b70f69c34968a1a63959d124f8a1c19b6c264d6ff57498ae6dcc7fb1f1ceb81 not found: ID does not exist"
Feb 28 09:16:36 crc kubenswrapper[4687]: I0228 09:16:36.956909 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-69x47"]
Feb 28 09:16:37 crc kubenswrapper[4687]: I0228 09:16:37.502657 4687 scope.go:117] "RemoveContainer" containerID="b6499e19e98bfef81b679942402139ecdcad9fa0f60e1cabdc12729e3c1393c6"
Feb 28 09:16:37 crc kubenswrapper[4687]: I0228 09:16:37.952803 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qtbgc" event={"ID":"c15f16ef-addd-4cba-b2c3-69b4691fa2c7","Type":"ContainerStarted","Data":"8595d8cdccbb11be309251ec5e4df046b9f77836cc1f06bc46248a347ef18d4b"}
Feb 28 09:16:37 crc kubenswrapper[4687]: I0228 09:16:37.971663 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-qtbgc" podStartSLOduration=1.473859392 podStartE2EDuration="1.971644806s" podCreationTimestamp="2026-02-28 09:16:36 +0000 UTC" firstStartedPulling="2026-02-28 09:16:36.737763966 +0000 UTC m=+788.428333303" lastFinishedPulling="2026-02-28 09:16:37.23554938 +0000 UTC m=+788.926118717" observedRunningTime="2026-02-28 09:16:37.966294001 +0000 UTC m=+789.656863338" watchObservedRunningTime="2026-02-28 09:16:37.971644806 +0000 UTC m=+789.662214143"
Feb 28 09:16:38 crc kubenswrapper[4687]: I0228 09:16:38.228947 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-qxhmg"
Feb 28 09:16:38 crc kubenswrapper[4687]: I0228 09:16:38.234386 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-lnfnj"
Feb 28 09:16:38 crc kubenswrapper[4687]: I0228 09:16:38.662787 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ad66e91-ca38-4a4f-9f4e-286cdfd930e9" path="/var/lib/kubelet/pods/5ad66e91-ca38-4a4f-9f4e-286cdfd930e9/volumes"
Feb 28 09:16:46 crc kubenswrapper[4687]: I0228 09:16:46.358994 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-qtbgc"
Feb 28 09:16:46 crc kubenswrapper[4687]: I0228 09:16:46.359332 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-qtbgc"
Feb 28 09:16:46 crc kubenswrapper[4687]: I0228 09:16:46.380937 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-qtbgc"
Feb 28 09:16:47 crc kubenswrapper[4687]: I0228 09:16:47.024988 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-qtbgc"
Feb 28 09:16:51 crc kubenswrapper[4687]: I0228 09:16:51.260416 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-94rf6"]
Feb 28 09:16:51 crc kubenswrapper[4687]: E0228 09:16:51.261844 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ad66e91-ca38-4a4f-9f4e-286cdfd930e9" containerName="registry-server"
Feb 28 09:16:51 crc kubenswrapper[4687]: I0228 09:16:51.261933 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ad66e91-ca38-4a4f-9f4e-286cdfd930e9" containerName="registry-server"
Feb 28 09:16:51 crc kubenswrapper[4687]: I0228 09:16:51.262109 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ad66e91-ca38-4a4f-9f4e-286cdfd930e9" containerName="registry-server"
Feb 28 09:16:51 crc kubenswrapper[4687]: I0228 09:16:51.262931 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94rf6"
Feb 28 09:16:51 crc kubenswrapper[4687]: I0228 09:16:51.268197 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-94rf6"]
Feb 28 09:16:51 crc kubenswrapper[4687]: I0228 09:16:51.409356 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wm2s\" (UniqueName: \"kubernetes.io/projected/e62f616c-2ff6-4543-9e65-45a54a7b4829-kube-api-access-7wm2s\") pod \"community-operators-94rf6\" (UID: \"e62f616c-2ff6-4543-9e65-45a54a7b4829\") " pod="openshift-marketplace/community-operators-94rf6"
Feb 28 09:16:51 crc kubenswrapper[4687]: I0228 09:16:51.409412 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e62f616c-2ff6-4543-9e65-45a54a7b4829-utilities\") pod \"community-operators-94rf6\" (UID: \"e62f616c-2ff6-4543-9e65-45a54a7b4829\") " pod="openshift-marketplace/community-operators-94rf6"
Feb 28 09:16:51 crc kubenswrapper[4687]: I0228 09:16:51.409467 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e62f616c-2ff6-4543-9e65-45a54a7b4829-catalog-content\") pod \"community-operators-94rf6\" (UID: \"e62f616c-2ff6-4543-9e65-45a54a7b4829\") " pod="openshift-marketplace/community-operators-94rf6"
Feb 28 09:16:51 crc kubenswrapper[4687]: I0228 09:16:51.510584 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e62f616c-2ff6-4543-9e65-45a54a7b4829-utilities\") pod \"community-operators-94rf6\" (UID: \"e62f616c-2ff6-4543-9e65-45a54a7b4829\") " pod="openshift-marketplace/community-operators-94rf6"
Feb 28 09:16:51 crc kubenswrapper[4687]: I0228 09:16:51.510697 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e62f616c-2ff6-4543-9e65-45a54a7b4829-catalog-content\") pod \"community-operators-94rf6\" (UID: \"e62f616c-2ff6-4543-9e65-45a54a7b4829\") " pod="openshift-marketplace/community-operators-94rf6"
Feb 28 09:16:51 crc kubenswrapper[4687]: I0228 09:16:51.510750 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wm2s\" (UniqueName: \"kubernetes.io/projected/e62f616c-2ff6-4543-9e65-45a54a7b4829-kube-api-access-7wm2s\") pod \"community-operators-94rf6\" (UID: \"e62f616c-2ff6-4543-9e65-45a54a7b4829\") " pod="openshift-marketplace/community-operators-94rf6"
Feb 28 09:16:51 crc kubenswrapper[4687]: I0228 09:16:51.511103 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e62f616c-2ff6-4543-9e65-45a54a7b4829-utilities\") pod \"community-operators-94rf6\" (UID: \"e62f616c-2ff6-4543-9e65-45a54a7b4829\") " pod="openshift-marketplace/community-operators-94rf6"
Feb 28 09:16:51 crc kubenswrapper[4687]: I0228 09:16:51.511231 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e62f616c-2ff6-4543-9e65-45a54a7b4829-catalog-content\") pod \"community-operators-94rf6\" (UID: \"e62f616c-2ff6-4543-9e65-45a54a7b4829\") " pod="openshift-marketplace/community-operators-94rf6"
Feb 28 09:16:51 crc kubenswrapper[4687]: I0228 09:16:51.538166 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wm2s\" (UniqueName: \"kubernetes.io/projected/e62f616c-2ff6-4543-9e65-45a54a7b4829-kube-api-access-7wm2s\") pod \"community-operators-94rf6\" (UID: \"e62f616c-2ff6-4543-9e65-45a54a7b4829\") " pod="openshift-marketplace/community-operators-94rf6"
Feb 28 09:16:51 crc kubenswrapper[4687]: I0228 09:16:51.581256 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94rf6"
Feb 28 09:16:51 crc kubenswrapper[4687]: I0228 09:16:51.971645 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-94rf6"]
Feb 28 09:16:51 crc kubenswrapper[4687]: W0228 09:16:51.975499 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode62f616c_2ff6_4543_9e65_45a54a7b4829.slice/crio-8d2e2927b0b8a8142986e30d055a41c909b6f45a8e5e9a7c45224c533977742e WatchSource:0}: Error finding container 8d2e2927b0b8a8142986e30d055a41c909b6f45a8e5e9a7c45224c533977742e: Status 404 returned error can't find the container with id 8d2e2927b0b8a8142986e30d055a41c909b6f45a8e5e9a7c45224c533977742e
Feb 28 09:16:52 crc kubenswrapper[4687]: I0228 09:16:52.037384 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94rf6" event={"ID":"e62f616c-2ff6-4543-9e65-45a54a7b4829","Type":"ContainerStarted","Data":"8d2e2927b0b8a8142986e30d055a41c909b6f45a8e5e9a7c45224c533977742e"}
Feb 28 09:16:53 crc kubenswrapper[4687]: I0228 09:16:53.044834 4687 generic.go:334] "Generic (PLEG): container finished" podID="e62f616c-2ff6-4543-9e65-45a54a7b4829" containerID="de4dcf5a99beb98fdb0254a93750d3c6c2f6a25a0e861cad979517f8d5e70d30" exitCode=0
Feb 28 09:16:53 crc kubenswrapper[4687]: I0228 09:16:53.044940 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94rf6" event={"ID":"e62f616c-2ff6-4543-9e65-45a54a7b4829","Type":"ContainerDied","Data":"de4dcf5a99beb98fdb0254a93750d3c6c2f6a25a0e861cad979517f8d5e70d30"}
Feb 28 09:16:54 crc kubenswrapper[4687]: I0228 09:16:54.053336 4687 generic.go:334] "Generic (PLEG): container finished" podID="e62f616c-2ff6-4543-9e65-45a54a7b4829" containerID="90b5bd90e27caef7e4205f71c9d52235edc6d9b186be1e9edb9ab72b3146d8f1" exitCode=0
Feb 28 09:16:54 crc kubenswrapper[4687]: I0228
09:16:54.053393 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94rf6" event={"ID":"e62f616c-2ff6-4543-9e65-45a54a7b4829","Type":"ContainerDied","Data":"90b5bd90e27caef7e4205f71c9d52235edc6d9b186be1e9edb9ab72b3146d8f1"} Feb 28 09:16:54 crc kubenswrapper[4687]: I0228 09:16:54.080093 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf"] Feb 28 09:16:54 crc kubenswrapper[4687]: I0228 09:16:54.081440 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf" Feb 28 09:16:54 crc kubenswrapper[4687]: I0228 09:16:54.082949 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-qdpch" Feb 28 09:16:54 crc kubenswrapper[4687]: I0228 09:16:54.087596 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf"] Feb 28 09:16:54 crc kubenswrapper[4687]: I0228 09:16:54.143941 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cc095223-5798-4cc2-a762-ca92a629167c-bundle\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf\" (UID: \"cc095223-5798-4cc2-a762-ca92a629167c\") " pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf" Feb 28 09:16:54 crc kubenswrapper[4687]: I0228 09:16:54.144291 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cc095223-5798-4cc2-a762-ca92a629167c-util\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf\" (UID: \"cc095223-5798-4cc2-a762-ca92a629167c\") " 
pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf" Feb 28 09:16:54 crc kubenswrapper[4687]: I0228 09:16:54.144340 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnjgk\" (UniqueName: \"kubernetes.io/projected/cc095223-5798-4cc2-a762-ca92a629167c-kube-api-access-vnjgk\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf\" (UID: \"cc095223-5798-4cc2-a762-ca92a629167c\") " pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf" Feb 28 09:16:54 crc kubenswrapper[4687]: I0228 09:16:54.245513 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cc095223-5798-4cc2-a762-ca92a629167c-bundle\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf\" (UID: \"cc095223-5798-4cc2-a762-ca92a629167c\") " pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf" Feb 28 09:16:54 crc kubenswrapper[4687]: I0228 09:16:54.245714 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cc095223-5798-4cc2-a762-ca92a629167c-util\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf\" (UID: \"cc095223-5798-4cc2-a762-ca92a629167c\") " pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf" Feb 28 09:16:54 crc kubenswrapper[4687]: I0228 09:16:54.245757 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnjgk\" (UniqueName: \"kubernetes.io/projected/cc095223-5798-4cc2-a762-ca92a629167c-kube-api-access-vnjgk\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf\" (UID: \"cc095223-5798-4cc2-a762-ca92a629167c\") " pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf" Feb 28 09:16:54 crc 
kubenswrapper[4687]: I0228 09:16:54.246464 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cc095223-5798-4cc2-a762-ca92a629167c-bundle\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf\" (UID: \"cc095223-5798-4cc2-a762-ca92a629167c\") " pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf" Feb 28 09:16:54 crc kubenswrapper[4687]: I0228 09:16:54.246981 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cc095223-5798-4cc2-a762-ca92a629167c-util\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf\" (UID: \"cc095223-5798-4cc2-a762-ca92a629167c\") " pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf" Feb 28 09:16:54 crc kubenswrapper[4687]: I0228 09:16:54.264154 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnjgk\" (UniqueName: \"kubernetes.io/projected/cc095223-5798-4cc2-a762-ca92a629167c-kube-api-access-vnjgk\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf\" (UID: \"cc095223-5798-4cc2-a762-ca92a629167c\") " pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf" Feb 28 09:16:54 crc kubenswrapper[4687]: I0228 09:16:54.394444 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf" Feb 28 09:16:54 crc kubenswrapper[4687]: I0228 09:16:54.556996 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf"] Feb 28 09:16:54 crc kubenswrapper[4687]: I0228 09:16:54.845410 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hxfkt"] Feb 28 09:16:54 crc kubenswrapper[4687]: I0228 09:16:54.846965 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hxfkt" Feb 28 09:16:54 crc kubenswrapper[4687]: I0228 09:16:54.856616 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hxfkt"] Feb 28 09:16:54 crc kubenswrapper[4687]: I0228 09:16:54.958403 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9d1114d-4832-4c47-bb6d-99364da4d736-catalog-content\") pod \"redhat-marketplace-hxfkt\" (UID: \"a9d1114d-4832-4c47-bb6d-99364da4d736\") " pod="openshift-marketplace/redhat-marketplace-hxfkt" Feb 28 09:16:54 crc kubenswrapper[4687]: I0228 09:16:54.958620 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9d1114d-4832-4c47-bb6d-99364da4d736-utilities\") pod \"redhat-marketplace-hxfkt\" (UID: \"a9d1114d-4832-4c47-bb6d-99364da4d736\") " pod="openshift-marketplace/redhat-marketplace-hxfkt" Feb 28 09:16:54 crc kubenswrapper[4687]: I0228 09:16:54.959127 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnpgt\" (UniqueName: \"kubernetes.io/projected/a9d1114d-4832-4c47-bb6d-99364da4d736-kube-api-access-pnpgt\") pod \"redhat-marketplace-hxfkt\" 
(UID: \"a9d1114d-4832-4c47-bb6d-99364da4d736\") " pod="openshift-marketplace/redhat-marketplace-hxfkt" Feb 28 09:16:55 crc kubenswrapper[4687]: I0228 09:16:55.002518 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:16:55 crc kubenswrapper[4687]: I0228 09:16:55.002621 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:16:55 crc kubenswrapper[4687]: I0228 09:16:55.002698 4687 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" Feb 28 09:16:55 crc kubenswrapper[4687]: I0228 09:16:55.003487 4687 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e2099836a5e3e90d046dbb8521988fee6933b3b356479c2ff7510ccbe5caaedf"} pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 28 09:16:55 crc kubenswrapper[4687]: I0228 09:16:55.003553 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" containerID="cri-o://e2099836a5e3e90d046dbb8521988fee6933b3b356479c2ff7510ccbe5caaedf" gracePeriod=600 Feb 28 09:16:55 crc kubenswrapper[4687]: I0228 09:16:55.066318 4687 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9d1114d-4832-4c47-bb6d-99364da4d736-catalog-content\") pod \"redhat-marketplace-hxfkt\" (UID: \"a9d1114d-4832-4c47-bb6d-99364da4d736\") " pod="openshift-marketplace/redhat-marketplace-hxfkt" Feb 28 09:16:55 crc kubenswrapper[4687]: I0228 09:16:55.066385 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9d1114d-4832-4c47-bb6d-99364da4d736-utilities\") pod \"redhat-marketplace-hxfkt\" (UID: \"a9d1114d-4832-4c47-bb6d-99364da4d736\") " pod="openshift-marketplace/redhat-marketplace-hxfkt" Feb 28 09:16:55 crc kubenswrapper[4687]: I0228 09:16:55.066437 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnpgt\" (UniqueName: \"kubernetes.io/projected/a9d1114d-4832-4c47-bb6d-99364da4d736-kube-api-access-pnpgt\") pod \"redhat-marketplace-hxfkt\" (UID: \"a9d1114d-4832-4c47-bb6d-99364da4d736\") " pod="openshift-marketplace/redhat-marketplace-hxfkt" Feb 28 09:16:55 crc kubenswrapper[4687]: I0228 09:16:55.067502 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9d1114d-4832-4c47-bb6d-99364da4d736-catalog-content\") pod \"redhat-marketplace-hxfkt\" (UID: \"a9d1114d-4832-4c47-bb6d-99364da4d736\") " pod="openshift-marketplace/redhat-marketplace-hxfkt" Feb 28 09:16:55 crc kubenswrapper[4687]: I0228 09:16:55.073560 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9d1114d-4832-4c47-bb6d-99364da4d736-utilities\") pod \"redhat-marketplace-hxfkt\" (UID: \"a9d1114d-4832-4c47-bb6d-99364da4d736\") " pod="openshift-marketplace/redhat-marketplace-hxfkt" Feb 28 09:16:55 crc kubenswrapper[4687]: I0228 09:16:55.076833 4687 generic.go:334] "Generic (PLEG): container finished" 
podID="cc095223-5798-4cc2-a762-ca92a629167c" containerID="ad6e513d136bf3090bd4008c4d882ff23e6943e595e245d7aa1035f425aee062" exitCode=0 Feb 28 09:16:55 crc kubenswrapper[4687]: I0228 09:16:55.076865 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf" event={"ID":"cc095223-5798-4cc2-a762-ca92a629167c","Type":"ContainerDied","Data":"ad6e513d136bf3090bd4008c4d882ff23e6943e595e245d7aa1035f425aee062"} Feb 28 09:16:55 crc kubenswrapper[4687]: I0228 09:16:55.076944 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf" event={"ID":"cc095223-5798-4cc2-a762-ca92a629167c","Type":"ContainerStarted","Data":"f2eec7c1064f9b57798015e1751447672ab5f92e90304145aa78adfbbfe33c3c"} Feb 28 09:16:55 crc kubenswrapper[4687]: I0228 09:16:55.078947 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94rf6" event={"ID":"e62f616c-2ff6-4543-9e65-45a54a7b4829","Type":"ContainerStarted","Data":"10fcd97f0ea5c919a0f8ff8b1dabd3feebd87c51b08a404fc56c834897b4253a"} Feb 28 09:16:55 crc kubenswrapper[4687]: I0228 09:16:55.090526 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnpgt\" (UniqueName: \"kubernetes.io/projected/a9d1114d-4832-4c47-bb6d-99364da4d736-kube-api-access-pnpgt\") pod \"redhat-marketplace-hxfkt\" (UID: \"a9d1114d-4832-4c47-bb6d-99364da4d736\") " pod="openshift-marketplace/redhat-marketplace-hxfkt" Feb 28 09:16:55 crc kubenswrapper[4687]: I0228 09:16:55.117317 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-94rf6" podStartSLOduration=2.655881552 podStartE2EDuration="4.117301324s" podCreationTimestamp="2026-02-28 09:16:51 +0000 UTC" firstStartedPulling="2026-02-28 09:16:53.046849487 +0000 UTC m=+804.737418825" lastFinishedPulling="2026-02-28 
09:16:54.508269259 +0000 UTC m=+806.198838597" observedRunningTime="2026-02-28 09:16:55.1164542 +0000 UTC m=+806.807023538" watchObservedRunningTime="2026-02-28 09:16:55.117301324 +0000 UTC m=+806.807870661" Feb 28 09:16:55 crc kubenswrapper[4687]: I0228 09:16:55.165663 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hxfkt" Feb 28 09:16:55 crc kubenswrapper[4687]: I0228 09:16:55.334941 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hxfkt"] Feb 28 09:16:55 crc kubenswrapper[4687]: W0228 09:16:55.343094 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9d1114d_4832_4c47_bb6d_99364da4d736.slice/crio-c23cf7eb9500b87703f3140afd57c9316c01fe61410b0a54ad2889d1d462ae49 WatchSource:0}: Error finding container c23cf7eb9500b87703f3140afd57c9316c01fe61410b0a54ad2889d1d462ae49: Status 404 returned error can't find the container with id c23cf7eb9500b87703f3140afd57c9316c01fe61410b0a54ad2889d1d462ae49 Feb 28 09:16:56 crc kubenswrapper[4687]: I0228 09:16:56.086962 4687 generic.go:334] "Generic (PLEG): container finished" podID="a9d1114d-4832-4c47-bb6d-99364da4d736" containerID="0fad0fbc9859e67d80472d6e8a3deaf7a7b96e515815102ceea6ba5f2dce4953" exitCode=0 Feb 28 09:16:56 crc kubenswrapper[4687]: I0228 09:16:56.087004 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hxfkt" event={"ID":"a9d1114d-4832-4c47-bb6d-99364da4d736","Type":"ContainerDied","Data":"0fad0fbc9859e67d80472d6e8a3deaf7a7b96e515815102ceea6ba5f2dce4953"} Feb 28 09:16:56 crc kubenswrapper[4687]: I0228 09:16:56.087527 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hxfkt" 
event={"ID":"a9d1114d-4832-4c47-bb6d-99364da4d736","Type":"ContainerStarted","Data":"c23cf7eb9500b87703f3140afd57c9316c01fe61410b0a54ad2889d1d462ae49"} Feb 28 09:16:56 crc kubenswrapper[4687]: I0228 09:16:56.089832 4687 generic.go:334] "Generic (PLEG): container finished" podID="cc095223-5798-4cc2-a762-ca92a629167c" containerID="421dd205cffc99741f2a4ae0e10761bbb291d5e3c9e2292f2143a9a6ab0747c1" exitCode=0 Feb 28 09:16:56 crc kubenswrapper[4687]: I0228 09:16:56.089942 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf" event={"ID":"cc095223-5798-4cc2-a762-ca92a629167c","Type":"ContainerDied","Data":"421dd205cffc99741f2a4ae0e10761bbb291d5e3c9e2292f2143a9a6ab0747c1"} Feb 28 09:16:56 crc kubenswrapper[4687]: I0228 09:16:56.092873 4687 generic.go:334] "Generic (PLEG): container finished" podID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerID="e2099836a5e3e90d046dbb8521988fee6933b3b356479c2ff7510ccbe5caaedf" exitCode=0 Feb 28 09:16:56 crc kubenswrapper[4687]: I0228 09:16:56.092927 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerDied","Data":"e2099836a5e3e90d046dbb8521988fee6933b3b356479c2ff7510ccbe5caaedf"} Feb 28 09:16:56 crc kubenswrapper[4687]: I0228 09:16:56.092976 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerStarted","Data":"f16534f65e44ed5dcb5a741301bfadba47516c592259f18b72f5912611ebb09f"} Feb 28 09:16:56 crc kubenswrapper[4687]: I0228 09:16:56.092994 4687 scope.go:117] "RemoveContainer" containerID="bcbde49ebdbfb08d03f55668dbe45e77e9c15c2d2f6e5cdcc206fabca01051bf" Feb 28 09:16:57 crc kubenswrapper[4687]: I0228 09:16:57.105785 4687 generic.go:334] "Generic (PLEG): container finished" 
podID="a9d1114d-4832-4c47-bb6d-99364da4d736" containerID="bb00eedb68448155489798404084be5135d0ba7a24816aca44222518cfeee612" exitCode=0 Feb 28 09:16:57 crc kubenswrapper[4687]: I0228 09:16:57.105892 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hxfkt" event={"ID":"a9d1114d-4832-4c47-bb6d-99364da4d736","Type":"ContainerDied","Data":"bb00eedb68448155489798404084be5135d0ba7a24816aca44222518cfeee612"} Feb 28 09:16:57 crc kubenswrapper[4687]: I0228 09:16:57.110996 4687 generic.go:334] "Generic (PLEG): container finished" podID="cc095223-5798-4cc2-a762-ca92a629167c" containerID="6ec17f611f3d881516905f9a94986a1e0edc9b71929166820eee282ebd9b35fa" exitCode=0 Feb 28 09:16:57 crc kubenswrapper[4687]: I0228 09:16:57.111084 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf" event={"ID":"cc095223-5798-4cc2-a762-ca92a629167c","Type":"ContainerDied","Data":"6ec17f611f3d881516905f9a94986a1e0edc9b71929166820eee282ebd9b35fa"} Feb 28 09:16:58 crc kubenswrapper[4687]: I0228 09:16:58.119823 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hxfkt" event={"ID":"a9d1114d-4832-4c47-bb6d-99364da4d736","Type":"ContainerStarted","Data":"96dd661becd38241a6de54208e8b01c65df772536a7ff4929fbb077634afdcba"} Feb 28 09:16:58 crc kubenswrapper[4687]: I0228 09:16:58.134284 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hxfkt" podStartSLOduration=2.66815852 podStartE2EDuration="4.134265066s" podCreationTimestamp="2026-02-28 09:16:54 +0000 UTC" firstStartedPulling="2026-02-28 09:16:56.088655801 +0000 UTC m=+807.779225139" lastFinishedPulling="2026-02-28 09:16:57.554762348 +0000 UTC m=+809.245331685" observedRunningTime="2026-02-28 09:16:58.132452017 +0000 UTC m=+809.823021354" watchObservedRunningTime="2026-02-28 09:16:58.134265066 +0000 
UTC m=+809.824834404" Feb 28 09:16:58 crc kubenswrapper[4687]: I0228 09:16:58.312312 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf" Feb 28 09:16:58 crc kubenswrapper[4687]: I0228 09:16:58.511039 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cc095223-5798-4cc2-a762-ca92a629167c-util\") pod \"cc095223-5798-4cc2-a762-ca92a629167c\" (UID: \"cc095223-5798-4cc2-a762-ca92a629167c\") " Feb 28 09:16:58 crc kubenswrapper[4687]: I0228 09:16:58.511152 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cc095223-5798-4cc2-a762-ca92a629167c-bundle\") pod \"cc095223-5798-4cc2-a762-ca92a629167c\" (UID: \"cc095223-5798-4cc2-a762-ca92a629167c\") " Feb 28 09:16:58 crc kubenswrapper[4687]: I0228 09:16:58.511243 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnjgk\" (UniqueName: \"kubernetes.io/projected/cc095223-5798-4cc2-a762-ca92a629167c-kube-api-access-vnjgk\") pod \"cc095223-5798-4cc2-a762-ca92a629167c\" (UID: \"cc095223-5798-4cc2-a762-ca92a629167c\") " Feb 28 09:16:58 crc kubenswrapper[4687]: I0228 09:16:58.511955 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc095223-5798-4cc2-a762-ca92a629167c-bundle" (OuterVolumeSpecName: "bundle") pod "cc095223-5798-4cc2-a762-ca92a629167c" (UID: "cc095223-5798-4cc2-a762-ca92a629167c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:16:58 crc kubenswrapper[4687]: I0228 09:16:58.518521 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc095223-5798-4cc2-a762-ca92a629167c-kube-api-access-vnjgk" (OuterVolumeSpecName: "kube-api-access-vnjgk") pod "cc095223-5798-4cc2-a762-ca92a629167c" (UID: "cc095223-5798-4cc2-a762-ca92a629167c"). InnerVolumeSpecName "kube-api-access-vnjgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:16:58 crc kubenswrapper[4687]: I0228 09:16:58.521206 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc095223-5798-4cc2-a762-ca92a629167c-util" (OuterVolumeSpecName: "util") pod "cc095223-5798-4cc2-a762-ca92a629167c" (UID: "cc095223-5798-4cc2-a762-ca92a629167c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:16:58 crc kubenswrapper[4687]: I0228 09:16:58.612842 4687 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cc095223-5798-4cc2-a762-ca92a629167c-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:16:58 crc kubenswrapper[4687]: I0228 09:16:58.612875 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnjgk\" (UniqueName: \"kubernetes.io/projected/cc095223-5798-4cc2-a762-ca92a629167c-kube-api-access-vnjgk\") on node \"crc\" DevicePath \"\"" Feb 28 09:16:58 crc kubenswrapper[4687]: I0228 09:16:58.612887 4687 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cc095223-5798-4cc2-a762-ca92a629167c-util\") on node \"crc\" DevicePath \"\"" Feb 28 09:16:59 crc kubenswrapper[4687]: I0228 09:16:59.128324 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf" 
event={"ID":"cc095223-5798-4cc2-a762-ca92a629167c","Type":"ContainerDied","Data":"f2eec7c1064f9b57798015e1751447672ab5f92e90304145aa78adfbbfe33c3c"} Feb 28 09:16:59 crc kubenswrapper[4687]: I0228 09:16:59.128748 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2eec7c1064f9b57798015e1751447672ab5f92e90304145aa78adfbbfe33c3c" Feb 28 09:16:59 crc kubenswrapper[4687]: I0228 09:16:59.128357 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf" Feb 28 09:17:01 crc kubenswrapper[4687]: I0228 09:17:01.075588 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-595c94944c-4zqnh"] Feb 28 09:17:01 crc kubenswrapper[4687]: E0228 09:17:01.076029 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc095223-5798-4cc2-a762-ca92a629167c" containerName="extract" Feb 28 09:17:01 crc kubenswrapper[4687]: I0228 09:17:01.076042 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc095223-5798-4cc2-a762-ca92a629167c" containerName="extract" Feb 28 09:17:01 crc kubenswrapper[4687]: E0228 09:17:01.076055 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc095223-5798-4cc2-a762-ca92a629167c" containerName="util" Feb 28 09:17:01 crc kubenswrapper[4687]: I0228 09:17:01.076061 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc095223-5798-4cc2-a762-ca92a629167c" containerName="util" Feb 28 09:17:01 crc kubenswrapper[4687]: E0228 09:17:01.076078 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc095223-5798-4cc2-a762-ca92a629167c" containerName="pull" Feb 28 09:17:01 crc kubenswrapper[4687]: I0228 09:17:01.076084 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc095223-5798-4cc2-a762-ca92a629167c" containerName="pull" Feb 28 09:17:01 crc kubenswrapper[4687]: I0228 09:17:01.076180 4687 
memory_manager.go:354] "RemoveStaleState removing state" podUID="cc095223-5798-4cc2-a762-ca92a629167c" containerName="extract"
Feb 28 09:17:01 crc kubenswrapper[4687]: I0228 09:17:01.076539 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-595c94944c-4zqnh"
Feb 28 09:17:01 crc kubenswrapper[4687]: I0228 09:17:01.080072 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-bbd8w"
Feb 28 09:17:01 crc kubenswrapper[4687]: I0228 09:17:01.098413 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-595c94944c-4zqnh"]
Feb 28 09:17:01 crc kubenswrapper[4687]: I0228 09:17:01.248277 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktgwd\" (UniqueName: \"kubernetes.io/projected/fff03855-1690-4745-825d-919a9f9469ea-kube-api-access-ktgwd\") pod \"openstack-operator-controller-init-595c94944c-4zqnh\" (UID: \"fff03855-1690-4745-825d-919a9f9469ea\") " pod="openstack-operators/openstack-operator-controller-init-595c94944c-4zqnh"
Feb 28 09:17:01 crc kubenswrapper[4687]: I0228 09:17:01.350403 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktgwd\" (UniqueName: \"kubernetes.io/projected/fff03855-1690-4745-825d-919a9f9469ea-kube-api-access-ktgwd\") pod \"openstack-operator-controller-init-595c94944c-4zqnh\" (UID: \"fff03855-1690-4745-825d-919a9f9469ea\") " pod="openstack-operators/openstack-operator-controller-init-595c94944c-4zqnh"
Feb 28 09:17:01 crc kubenswrapper[4687]: I0228 09:17:01.368391 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktgwd\" (UniqueName: \"kubernetes.io/projected/fff03855-1690-4745-825d-919a9f9469ea-kube-api-access-ktgwd\") pod \"openstack-operator-controller-init-595c94944c-4zqnh\" (UID: \"fff03855-1690-4745-825d-919a9f9469ea\") " pod="openstack-operators/openstack-operator-controller-init-595c94944c-4zqnh"
Feb 28 09:17:01 crc kubenswrapper[4687]: I0228 09:17:01.388587 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-595c94944c-4zqnh"
Feb 28 09:17:01 crc kubenswrapper[4687]: I0228 09:17:01.581726 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-94rf6"
Feb 28 09:17:01 crc kubenswrapper[4687]: I0228 09:17:01.582045 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-94rf6"
Feb 28 09:17:01 crc kubenswrapper[4687]: I0228 09:17:01.617480 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-94rf6"
Feb 28 09:17:01 crc kubenswrapper[4687]: I0228 09:17:01.618206 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-595c94944c-4zqnh"]
Feb 28 09:17:01 crc kubenswrapper[4687]: I0228 09:17:01.633731 4687 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 28 09:17:02 crc kubenswrapper[4687]: I0228 09:17:02.160368 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-595c94944c-4zqnh" event={"ID":"fff03855-1690-4745-825d-919a9f9469ea","Type":"ContainerStarted","Data":"8627a075ba2f465a30333dbb79830e50c6fdf52464a3efff525fce61e1aeeb59"}
Feb 28 09:17:02 crc kubenswrapper[4687]: I0228 09:17:02.191900 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-94rf6"
Feb 28 09:17:04 crc kubenswrapper[4687]: I0228 09:17:04.041206 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-94rf6"]
Feb 28 09:17:04 crc kubenswrapper[4687]: I0228 09:17:04.173686 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-94rf6" podUID="e62f616c-2ff6-4543-9e65-45a54a7b4829" containerName="registry-server" containerID="cri-o://10fcd97f0ea5c919a0f8ff8b1dabd3feebd87c51b08a404fc56c834897b4253a" gracePeriod=2
Feb 28 09:17:05 crc kubenswrapper[4687]: I0228 09:17:05.167155 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hxfkt"
Feb 28 09:17:05 crc kubenswrapper[4687]: I0228 09:17:05.167206 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hxfkt"
Feb 28 09:17:05 crc kubenswrapper[4687]: I0228 09:17:05.183354 4687 generic.go:334] "Generic (PLEG): container finished" podID="e62f616c-2ff6-4543-9e65-45a54a7b4829" containerID="10fcd97f0ea5c919a0f8ff8b1dabd3feebd87c51b08a404fc56c834897b4253a" exitCode=0
Feb 28 09:17:05 crc kubenswrapper[4687]: I0228 09:17:05.183395 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94rf6" event={"ID":"e62f616c-2ff6-4543-9e65-45a54a7b4829","Type":"ContainerDied","Data":"10fcd97f0ea5c919a0f8ff8b1dabd3feebd87c51b08a404fc56c834897b4253a"}
Feb 28 09:17:05 crc kubenswrapper[4687]: I0228 09:17:05.199578 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hxfkt"
Feb 28 09:17:05 crc kubenswrapper[4687]: I0228 09:17:05.233504 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hxfkt"
Feb 28 09:17:05 crc kubenswrapper[4687]: I0228 09:17:05.489586 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94rf6"
Feb 28 09:17:05 crc kubenswrapper[4687]: I0228 09:17:05.562834 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e62f616c-2ff6-4543-9e65-45a54a7b4829-catalog-content\") pod \"e62f616c-2ff6-4543-9e65-45a54a7b4829\" (UID: \"e62f616c-2ff6-4543-9e65-45a54a7b4829\") "
Feb 28 09:17:05 crc kubenswrapper[4687]: I0228 09:17:05.562952 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wm2s\" (UniqueName: \"kubernetes.io/projected/e62f616c-2ff6-4543-9e65-45a54a7b4829-kube-api-access-7wm2s\") pod \"e62f616c-2ff6-4543-9e65-45a54a7b4829\" (UID: \"e62f616c-2ff6-4543-9e65-45a54a7b4829\") "
Feb 28 09:17:05 crc kubenswrapper[4687]: I0228 09:17:05.562984 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e62f616c-2ff6-4543-9e65-45a54a7b4829-utilities\") pod \"e62f616c-2ff6-4543-9e65-45a54a7b4829\" (UID: \"e62f616c-2ff6-4543-9e65-45a54a7b4829\") "
Feb 28 09:17:05 crc kubenswrapper[4687]: I0228 09:17:05.564111 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e62f616c-2ff6-4543-9e65-45a54a7b4829-utilities" (OuterVolumeSpecName: "utilities") pod "e62f616c-2ff6-4543-9e65-45a54a7b4829" (UID: "e62f616c-2ff6-4543-9e65-45a54a7b4829"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 09:17:05 crc kubenswrapper[4687]: I0228 09:17:05.569395 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e62f616c-2ff6-4543-9e65-45a54a7b4829-kube-api-access-7wm2s" (OuterVolumeSpecName: "kube-api-access-7wm2s") pod "e62f616c-2ff6-4543-9e65-45a54a7b4829" (UID: "e62f616c-2ff6-4543-9e65-45a54a7b4829"). InnerVolumeSpecName "kube-api-access-7wm2s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 09:17:05 crc kubenswrapper[4687]: I0228 09:17:05.605256 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e62f616c-2ff6-4543-9e65-45a54a7b4829-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e62f616c-2ff6-4543-9e65-45a54a7b4829" (UID: "e62f616c-2ff6-4543-9e65-45a54a7b4829"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 09:17:05 crc kubenswrapper[4687]: I0228 09:17:05.665316 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e62f616c-2ff6-4543-9e65-45a54a7b4829-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 28 09:17:05 crc kubenswrapper[4687]: I0228 09:17:05.665356 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wm2s\" (UniqueName: \"kubernetes.io/projected/e62f616c-2ff6-4543-9e65-45a54a7b4829-kube-api-access-7wm2s\") on node \"crc\" DevicePath \"\""
Feb 28 09:17:05 crc kubenswrapper[4687]: I0228 09:17:05.665370 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e62f616c-2ff6-4543-9e65-45a54a7b4829-utilities\") on node \"crc\" DevicePath \"\""
Feb 28 09:17:06 crc kubenswrapper[4687]: I0228 09:17:06.191108 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94rf6" event={"ID":"e62f616c-2ff6-4543-9e65-45a54a7b4829","Type":"ContainerDied","Data":"8d2e2927b0b8a8142986e30d055a41c909b6f45a8e5e9a7c45224c533977742e"}
Feb 28 09:17:06 crc kubenswrapper[4687]: I0228 09:17:06.191175 4687 scope.go:117] "RemoveContainer" containerID="10fcd97f0ea5c919a0f8ff8b1dabd3feebd87c51b08a404fc56c834897b4253a"
Feb 28 09:17:06 crc kubenswrapper[4687]: I0228 09:17:06.191285 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94rf6"
Feb 28 09:17:06 crc kubenswrapper[4687]: I0228 09:17:06.196174 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-595c94944c-4zqnh" event={"ID":"fff03855-1690-4745-825d-919a9f9469ea","Type":"ContainerStarted","Data":"113832ae06a1bc2c4252c2fff587b3a164a23eba2930e26dfc0cb9885f816b85"}
Feb 28 09:17:06 crc kubenswrapper[4687]: I0228 09:17:06.196576 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-595c94944c-4zqnh"
Feb 28 09:17:06 crc kubenswrapper[4687]: I0228 09:17:06.208557 4687 scope.go:117] "RemoveContainer" containerID="90b5bd90e27caef7e4205f71c9d52235edc6d9b186be1e9edb9ab72b3146d8f1"
Feb 28 09:17:06 crc kubenswrapper[4687]: I0228 09:17:06.222806 4687 scope.go:117] "RemoveContainer" containerID="de4dcf5a99beb98fdb0254a93750d3c6c2f6a25a0e861cad979517f8d5e70d30"
Feb 28 09:17:06 crc kubenswrapper[4687]: I0228 09:17:06.228250 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-595c94944c-4zqnh" podStartSLOduration=1.5507894709999999 podStartE2EDuration="5.228234847s" podCreationTimestamp="2026-02-28 09:17:01 +0000 UTC" firstStartedPulling="2026-02-28 09:17:01.633510812 +0000 UTC m=+813.324080150" lastFinishedPulling="2026-02-28 09:17:05.310956189 +0000 UTC m=+817.001525526" observedRunningTime="2026-02-28 09:17:06.224709476 +0000 UTC m=+817.915278813" watchObservedRunningTime="2026-02-28 09:17:06.228234847 +0000 UTC m=+817.918804185"
Feb 28 09:17:06 crc kubenswrapper[4687]: I0228 09:17:06.238951 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-94rf6"]
Feb 28 09:17:06 crc kubenswrapper[4687]: I0228 09:17:06.244143 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-94rf6"]
Feb 28 09:17:06 crc kubenswrapper[4687]: I0228 09:17:06.663809 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e62f616c-2ff6-4543-9e65-45a54a7b4829" path="/var/lib/kubelet/pods/e62f616c-2ff6-4543-9e65-45a54a7b4829/volumes"
Feb 28 09:17:08 crc kubenswrapper[4687]: I0228 09:17:08.637736 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hxfkt"]
Feb 28 09:17:08 crc kubenswrapper[4687]: I0228 09:17:08.638032 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hxfkt" podUID="a9d1114d-4832-4c47-bb6d-99364da4d736" containerName="registry-server" containerID="cri-o://96dd661becd38241a6de54208e8b01c65df772536a7ff4929fbb077634afdcba" gracePeriod=2
Feb 28 09:17:08 crc kubenswrapper[4687]: I0228 09:17:08.976920 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hxfkt"
Feb 28 09:17:09 crc kubenswrapper[4687]: I0228 09:17:09.010663 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9d1114d-4832-4c47-bb6d-99364da4d736-utilities\") pod \"a9d1114d-4832-4c47-bb6d-99364da4d736\" (UID: \"a9d1114d-4832-4c47-bb6d-99364da4d736\") "
Feb 28 09:17:09 crc kubenswrapper[4687]: I0228 09:17:09.010712 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9d1114d-4832-4c47-bb6d-99364da4d736-catalog-content\") pod \"a9d1114d-4832-4c47-bb6d-99364da4d736\" (UID: \"a9d1114d-4832-4c47-bb6d-99364da4d736\") "
Feb 28 09:17:09 crc kubenswrapper[4687]: I0228 09:17:09.010793 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnpgt\" (UniqueName: \"kubernetes.io/projected/a9d1114d-4832-4c47-bb6d-99364da4d736-kube-api-access-pnpgt\") pod \"a9d1114d-4832-4c47-bb6d-99364da4d736\" (UID: \"a9d1114d-4832-4c47-bb6d-99364da4d736\") "
Feb 28 09:17:09 crc kubenswrapper[4687]: I0228 09:17:09.011457 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9d1114d-4832-4c47-bb6d-99364da4d736-utilities" (OuterVolumeSpecName: "utilities") pod "a9d1114d-4832-4c47-bb6d-99364da4d736" (UID: "a9d1114d-4832-4c47-bb6d-99364da4d736"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 09:17:09 crc kubenswrapper[4687]: I0228 09:17:09.015800 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9d1114d-4832-4c47-bb6d-99364da4d736-kube-api-access-pnpgt" (OuterVolumeSpecName: "kube-api-access-pnpgt") pod "a9d1114d-4832-4c47-bb6d-99364da4d736" (UID: "a9d1114d-4832-4c47-bb6d-99364da4d736"). InnerVolumeSpecName "kube-api-access-pnpgt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 09:17:09 crc kubenswrapper[4687]: I0228 09:17:09.032426 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9d1114d-4832-4c47-bb6d-99364da4d736-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9d1114d-4832-4c47-bb6d-99364da4d736" (UID: "a9d1114d-4832-4c47-bb6d-99364da4d736"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 09:17:09 crc kubenswrapper[4687]: I0228 09:17:09.111927 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9d1114d-4832-4c47-bb6d-99364da4d736-utilities\") on node \"crc\" DevicePath \"\""
Feb 28 09:17:09 crc kubenswrapper[4687]: I0228 09:17:09.111961 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9d1114d-4832-4c47-bb6d-99364da4d736-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 28 09:17:09 crc kubenswrapper[4687]: I0228 09:17:09.111976 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnpgt\" (UniqueName: \"kubernetes.io/projected/a9d1114d-4832-4c47-bb6d-99364da4d736-kube-api-access-pnpgt\") on node \"crc\" DevicePath \"\""
Feb 28 09:17:09 crc kubenswrapper[4687]: I0228 09:17:09.219274 4687 generic.go:334] "Generic (PLEG): container finished" podID="a9d1114d-4832-4c47-bb6d-99364da4d736" containerID="96dd661becd38241a6de54208e8b01c65df772536a7ff4929fbb077634afdcba" exitCode=0
Feb 28 09:17:09 crc kubenswrapper[4687]: I0228 09:17:09.219324 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hxfkt" event={"ID":"a9d1114d-4832-4c47-bb6d-99364da4d736","Type":"ContainerDied","Data":"96dd661becd38241a6de54208e8b01c65df772536a7ff4929fbb077634afdcba"}
Feb 28 09:17:09 crc kubenswrapper[4687]: I0228 09:17:09.219341 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hxfkt"
Feb 28 09:17:09 crc kubenswrapper[4687]: I0228 09:17:09.219352 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hxfkt" event={"ID":"a9d1114d-4832-4c47-bb6d-99364da4d736","Type":"ContainerDied","Data":"c23cf7eb9500b87703f3140afd57c9316c01fe61410b0a54ad2889d1d462ae49"}
Feb 28 09:17:09 crc kubenswrapper[4687]: I0228 09:17:09.219369 4687 scope.go:117] "RemoveContainer" containerID="96dd661becd38241a6de54208e8b01c65df772536a7ff4929fbb077634afdcba"
Feb 28 09:17:09 crc kubenswrapper[4687]: I0228 09:17:09.234847 4687 scope.go:117] "RemoveContainer" containerID="bb00eedb68448155489798404084be5135d0ba7a24816aca44222518cfeee612"
Feb 28 09:17:09 crc kubenswrapper[4687]: I0228 09:17:09.246422 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hxfkt"]
Feb 28 09:17:09 crc kubenswrapper[4687]: I0228 09:17:09.250803 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hxfkt"]
Feb 28 09:17:09 crc kubenswrapper[4687]: I0228 09:17:09.262672 4687 scope.go:117] "RemoveContainer" containerID="0fad0fbc9859e67d80472d6e8a3deaf7a7b96e515815102ceea6ba5f2dce4953"
Feb 28 09:17:09 crc kubenswrapper[4687]: I0228 09:17:09.274625 4687 scope.go:117] "RemoveContainer" containerID="96dd661becd38241a6de54208e8b01c65df772536a7ff4929fbb077634afdcba"
Feb 28 09:17:09 crc kubenswrapper[4687]: E0228 09:17:09.274878 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96dd661becd38241a6de54208e8b01c65df772536a7ff4929fbb077634afdcba\": container with ID starting with 96dd661becd38241a6de54208e8b01c65df772536a7ff4929fbb077634afdcba not found: ID does not exist" containerID="96dd661becd38241a6de54208e8b01c65df772536a7ff4929fbb077634afdcba"
Feb 28 09:17:09 crc kubenswrapper[4687]: I0228 09:17:09.274913 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96dd661becd38241a6de54208e8b01c65df772536a7ff4929fbb077634afdcba"} err="failed to get container status \"96dd661becd38241a6de54208e8b01c65df772536a7ff4929fbb077634afdcba\": rpc error: code = NotFound desc = could not find container \"96dd661becd38241a6de54208e8b01c65df772536a7ff4929fbb077634afdcba\": container with ID starting with 96dd661becd38241a6de54208e8b01c65df772536a7ff4929fbb077634afdcba not found: ID does not exist"
Feb 28 09:17:09 crc kubenswrapper[4687]: I0228 09:17:09.274936 4687 scope.go:117] "RemoveContainer" containerID="bb00eedb68448155489798404084be5135d0ba7a24816aca44222518cfeee612"
Feb 28 09:17:09 crc kubenswrapper[4687]: E0228 09:17:09.275187 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb00eedb68448155489798404084be5135d0ba7a24816aca44222518cfeee612\": container with ID starting with bb00eedb68448155489798404084be5135d0ba7a24816aca44222518cfeee612 not found: ID does not exist" containerID="bb00eedb68448155489798404084be5135d0ba7a24816aca44222518cfeee612"
Feb 28 09:17:09 crc kubenswrapper[4687]: I0228 09:17:09.275216 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb00eedb68448155489798404084be5135d0ba7a24816aca44222518cfeee612"} err="failed to get container status \"bb00eedb68448155489798404084be5135d0ba7a24816aca44222518cfeee612\": rpc error: code = NotFound desc = could not find container \"bb00eedb68448155489798404084be5135d0ba7a24816aca44222518cfeee612\": container with ID starting with bb00eedb68448155489798404084be5135d0ba7a24816aca44222518cfeee612 not found: ID does not exist"
Feb 28 09:17:09 crc kubenswrapper[4687]: I0228 09:17:09.275239 4687 scope.go:117] "RemoveContainer" containerID="0fad0fbc9859e67d80472d6e8a3deaf7a7b96e515815102ceea6ba5f2dce4953"
Feb 28 09:17:09 crc kubenswrapper[4687]: E0228 09:17:09.275463 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fad0fbc9859e67d80472d6e8a3deaf7a7b96e515815102ceea6ba5f2dce4953\": container with ID starting with 0fad0fbc9859e67d80472d6e8a3deaf7a7b96e515815102ceea6ba5f2dce4953 not found: ID does not exist" containerID="0fad0fbc9859e67d80472d6e8a3deaf7a7b96e515815102ceea6ba5f2dce4953"
Feb 28 09:17:09 crc kubenswrapper[4687]: I0228 09:17:09.275486 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fad0fbc9859e67d80472d6e8a3deaf7a7b96e515815102ceea6ba5f2dce4953"} err="failed to get container status \"0fad0fbc9859e67d80472d6e8a3deaf7a7b96e515815102ceea6ba5f2dce4953\": rpc error: code = NotFound desc = could not find container \"0fad0fbc9859e67d80472d6e8a3deaf7a7b96e515815102ceea6ba5f2dce4953\": container with ID starting with 0fad0fbc9859e67d80472d6e8a3deaf7a7b96e515815102ceea6ba5f2dce4953 not found: ID does not exist"
Feb 28 09:17:10 crc kubenswrapper[4687]: I0228 09:17:10.664292 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9d1114d-4832-4c47-bb6d-99364da4d736" path="/var/lib/kubelet/pods/a9d1114d-4832-4c47-bb6d-99364da4d736/volumes"
Feb 28 09:17:11 crc kubenswrapper[4687]: I0228 09:17:11.391599 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-595c94944c-4zqnh"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.522830 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-chfpl"]
Feb 28 09:17:30 crc kubenswrapper[4687]: E0228 09:17:30.523649 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9d1114d-4832-4c47-bb6d-99364da4d736" containerName="extract-utilities"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.523663 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9d1114d-4832-4c47-bb6d-99364da4d736" containerName="extract-utilities"
Feb 28 09:17:30 crc kubenswrapper[4687]: E0228 09:17:30.523679 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e62f616c-2ff6-4543-9e65-45a54a7b4829" containerName="registry-server"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.523684 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="e62f616c-2ff6-4543-9e65-45a54a7b4829" containerName="registry-server"
Feb 28 09:17:30 crc kubenswrapper[4687]: E0228 09:17:30.523691 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9d1114d-4832-4c47-bb6d-99364da4d736" containerName="extract-content"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.523697 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9d1114d-4832-4c47-bb6d-99364da4d736" containerName="extract-content"
Feb 28 09:17:30 crc kubenswrapper[4687]: E0228 09:17:30.523709 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e62f616c-2ff6-4543-9e65-45a54a7b4829" containerName="extract-content"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.523715 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="e62f616c-2ff6-4543-9e65-45a54a7b4829" containerName="extract-content"
Feb 28 09:17:30 crc kubenswrapper[4687]: E0228 09:17:30.523722 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9d1114d-4832-4c47-bb6d-99364da4d736" containerName="registry-server"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.523729 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9d1114d-4832-4c47-bb6d-99364da4d736" containerName="registry-server"
Feb 28 09:17:30 crc kubenswrapper[4687]: E0228 09:17:30.523739 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e62f616c-2ff6-4543-9e65-45a54a7b4829" containerName="extract-utilities"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.523744 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="e62f616c-2ff6-4543-9e65-45a54a7b4829" containerName="extract-utilities"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.523851 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9d1114d-4832-4c47-bb6d-99364da4d736" containerName="registry-server"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.523860 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="e62f616c-2ff6-4543-9e65-45a54a7b4829" containerName="registry-server"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.524359 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-chfpl"
Feb 28 09:17:30 crc kubenswrapper[4687]: W0228 09:17:30.526012 4687 reflector.go:561] object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-724kt": failed to list *v1.Secret: secrets "cinder-operator-controller-manager-dockercfg-724kt" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object
Feb 28 09:17:30 crc kubenswrapper[4687]: E0228 09:17:30.526079 4687 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"cinder-operator-controller-manager-dockercfg-724kt\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cinder-operator-controller-manager-dockercfg-724kt\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.526223 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-jtdtt"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.526959 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-jtdtt"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.529322 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-mkhkr"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.534645 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-7wrs7"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.535206 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-7wrs7"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.536799 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-l9s74"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.548281 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-chfpl"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.558633 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-jtdtt"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.563052 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-7wrs7"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.574455 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-9zkzk"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.575318 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9zkzk"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.583688 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-4lrrv"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.613748 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-9zkzk"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.620682 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-ltpvl"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.627911 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ltpvl"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.635475 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-xkn84"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.673123 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-ltpvl"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.684442 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-v9vbd"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.685294 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-v9vbd"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.689490 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-vqdm7"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.690321 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vqdm7"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.690780 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-9jpz5"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.692415 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-6zfzw"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.692555 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.692724 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-9nm28"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.693323 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-9nm28"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.695518 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-4b8t5"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.697196 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-v9vbd"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.701052 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-vqdm7"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.707907 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-9nm28"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.711400 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-8r8kv"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.712185 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-8r8kv"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.713214 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcl7k\" (UniqueName: \"kubernetes.io/projected/dc30956e-12c6-4973-a99f-ae4b502abb17-kube-api-access-pcl7k\") pod \"barbican-operator-controller-manager-6db6876945-jtdtt\" (UID: \"dc30956e-12c6-4973-a99f-ae4b502abb17\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-jtdtt"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.713264 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnnwz\" (UniqueName: \"kubernetes.io/projected/40ae4140-3768-425a-9791-234afb6297fe-kube-api-access-mnnwz\") pod \"cinder-operator-controller-manager-55d77d7b5c-chfpl\" (UID: \"40ae4140-3768-425a-9791-234afb6297fe\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-chfpl"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.713345 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcwnm\" (UniqueName: \"kubernetes.io/projected/30b87ec4-ee50-402d-8afc-a3f9241bbc4c-kube-api-access-fcwnm\") pod \"glance-operator-controller-manager-64db6967f8-9zkzk\" (UID: \"30b87ec4-ee50-402d-8afc-a3f9241bbc4c\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9zkzk"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.713365 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdg4r\" (UniqueName: \"kubernetes.io/projected/c3d5a3fe-4e59-43c3-aef3-33c3e7830cb1-kube-api-access-pdg4r\") pod \"designate-operator-controller-manager-5d87c9d997-7wrs7\" (UID: \"c3d5a3fe-4e59-43c3-aef3-33c3e7830cb1\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-7wrs7"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.713520 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-8r8kv"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.713868 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-rtnn5"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.732172 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-jbzlm"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.733068 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-jbzlm"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.735360 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-rxc2l"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.739767 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-jw6hs"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.740379 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-jw6hs"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.744507 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-bcmmb"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.761239 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-hsvs9"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.762122 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-hsvs9"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.764078 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-fc84x"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.780372 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-jbzlm"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.784794 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-jw6hs"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.788741 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-hsvs9"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.793118 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-dsfvj"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.794402 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-dsfvj"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.796733 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-rbvm8"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.797009 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-kdxq5"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.798108 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-kdxq5"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.800244 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-dlnwx"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.804144 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-dsfvj"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.812231 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-kdxq5"]
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.816056 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmrz5\" (UniqueName: \"kubernetes.io/projected/caa33de5-0fe2-4930-bf89-0f8ad6a96ca2-kube-api-access-zmrz5\") pod \"infra-operator-controller-manager-f7fcc58b9-vqdm7\" (UID: \"caa33de5-0fe2-4930-bf89-0f8ad6a96ca2\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vqdm7"
Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.816223 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqrfs\" (UniqueName: 
\"kubernetes.io/projected/5945c472-0f03-4666-84ca-b8f4545db411-kube-api-access-fqrfs\") pod \"heat-operator-controller-manager-cf99c678f-ltpvl\" (UID: \"5945c472-0f03-4666-84ca-b8f4545db411\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ltpvl" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.816257 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvqfk\" (UniqueName: \"kubernetes.io/projected/72be3389-d521-4742-9081-8bdc3aef0dc6-kube-api-access-tvqfk\") pod \"nova-operator-controller-manager-74b6b5dc96-kdxq5\" (UID: \"72be3389-d521-4742-9081-8bdc3aef0dc6\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-kdxq5" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.816379 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcl7k\" (UniqueName: \"kubernetes.io/projected/dc30956e-12c6-4973-a99f-ae4b502abb17-kube-api-access-pcl7k\") pod \"barbican-operator-controller-manager-6db6876945-jtdtt\" (UID: \"dc30956e-12c6-4973-a99f-ae4b502abb17\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-jtdtt" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.816402 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9fvd\" (UniqueName: \"kubernetes.io/projected/134bd541-e4b0-4e84-b85d-a50c413d6cd2-kube-api-access-r9fvd\") pod \"octavia-operator-controller-manager-5d86c7ddb7-dsfvj\" (UID: \"134bd541-e4b0-4e84-b85d-a50c413d6cd2\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-dsfvj" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.816520 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/caa33de5-0fe2-4930-bf89-0f8ad6a96ca2-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vqdm7\" 
(UID: \"caa33de5-0fe2-4930-bf89-0f8ad6a96ca2\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vqdm7" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.816555 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnnwz\" (UniqueName: \"kubernetes.io/projected/40ae4140-3768-425a-9791-234afb6297fe-kube-api-access-mnnwz\") pod \"cinder-operator-controller-manager-55d77d7b5c-chfpl\" (UID: \"40ae4140-3768-425a-9791-234afb6297fe\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-chfpl" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.816673 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dd5x\" (UniqueName: \"kubernetes.io/projected/a2ca8c5d-3391-4ae4-a451-8a14fe2352aa-kube-api-access-2dd5x\") pod \"mariadb-operator-controller-manager-7b6bfb6475-jbzlm\" (UID: \"a2ca8c5d-3391-4ae4-a451-8a14fe2352aa\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-jbzlm" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.816712 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4wbt\" (UniqueName: \"kubernetes.io/projected/f5b51009-d199-4b88-9158-1b7b3b1848d3-kube-api-access-r4wbt\") pod \"ironic-operator-controller-manager-545456dc4-9nm28\" (UID: \"f5b51009-d199-4b88-9158-1b7b3b1848d3\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-9nm28" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.816834 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9t9s\" (UniqueName: \"kubernetes.io/projected/0e2af601-594d-47f7-95ef-0474051dae27-kube-api-access-h9t9s\") pod \"horizon-operator-controller-manager-78bc7f9bd9-v9vbd\" (UID: \"0e2af601-594d-47f7-95ef-0474051dae27\") " 
pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-v9vbd" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.816864 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9bwf\" (UniqueName: \"kubernetes.io/projected/09ff8e79-084a-4043-9061-c7007b041e86-kube-api-access-g9bwf\") pod \"neutron-operator-controller-manager-54688575f-hsvs9\" (UID: \"09ff8e79-084a-4043-9061-c7007b041e86\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-hsvs9" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.816976 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mckz2\" (UniqueName: \"kubernetes.io/projected/89b24774-f0eb-4d63-a124-1b244f195163-kube-api-access-mckz2\") pod \"manila-operator-controller-manager-67d996989d-jw6hs\" (UID: \"89b24774-f0eb-4d63-a124-1b244f195163\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-jw6hs" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.817001 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcwnm\" (UniqueName: \"kubernetes.io/projected/30b87ec4-ee50-402d-8afc-a3f9241bbc4c-kube-api-access-fcwnm\") pod \"glance-operator-controller-manager-64db6967f8-9zkzk\" (UID: \"30b87ec4-ee50-402d-8afc-a3f9241bbc4c\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9zkzk" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.817119 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdg4r\" (UniqueName: \"kubernetes.io/projected/c3d5a3fe-4e59-43c3-aef3-33c3e7830cb1-kube-api-access-pdg4r\") pod \"designate-operator-controller-manager-5d87c9d997-7wrs7\" (UID: \"c3d5a3fe-4e59-43c3-aef3-33c3e7830cb1\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-7wrs7" Feb 28 09:17:30 crc kubenswrapper[4687]: 
I0228 09:17:30.817161 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swknj\" (UniqueName: \"kubernetes.io/projected/14725449-2193-4b84-b736-31c04f9f43e4-kube-api-access-swknj\") pod \"keystone-operator-controller-manager-7c789f89c6-8r8kv\" (UID: \"14725449-2193-4b84-b736-31c04f9f43e4\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-8r8kv" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.819553 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776925xf7"] Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.820372 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776925xf7" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.827376 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.827400 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-mbx5f" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.831999 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-9fpjj"] Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.832908 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-9fpjj" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.836316 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-vk8dc" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.838294 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnnwz\" (UniqueName: \"kubernetes.io/projected/40ae4140-3768-425a-9791-234afb6297fe-kube-api-access-mnnwz\") pod \"cinder-operator-controller-manager-55d77d7b5c-chfpl\" (UID: \"40ae4140-3768-425a-9791-234afb6297fe\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-chfpl" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.838515 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-9fpjj"] Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.839887 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcl7k\" (UniqueName: \"kubernetes.io/projected/dc30956e-12c6-4973-a99f-ae4b502abb17-kube-api-access-pcl7k\") pod \"barbican-operator-controller-manager-6db6876945-jtdtt\" (UID: \"dc30956e-12c6-4973-a99f-ae4b502abb17\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-jtdtt" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.842256 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776925xf7"] Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.843237 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcwnm\" (UniqueName: \"kubernetes.io/projected/30b87ec4-ee50-402d-8afc-a3f9241bbc4c-kube-api-access-fcwnm\") pod \"glance-operator-controller-manager-64db6967f8-9zkzk\" (UID: \"30b87ec4-ee50-402d-8afc-a3f9241bbc4c\") " 
pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9zkzk" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.855044 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-jht6f"] Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.856073 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jht6f" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.857698 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-pb88x" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.860946 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-jtdtt" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.861781 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdg4r\" (UniqueName: \"kubernetes.io/projected/c3d5a3fe-4e59-43c3-aef3-33c3e7830cb1-kube-api-access-pdg4r\") pod \"designate-operator-controller-manager-5d87c9d997-7wrs7\" (UID: \"c3d5a3fe-4e59-43c3-aef3-33c3e7830cb1\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-7wrs7" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.861880 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-jht6f"] Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.872769 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-7wrs7" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.894962 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9zkzk" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.918183 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dd5x\" (UniqueName: \"kubernetes.io/projected/a2ca8c5d-3391-4ae4-a451-8a14fe2352aa-kube-api-access-2dd5x\") pod \"mariadb-operator-controller-manager-7b6bfb6475-jbzlm\" (UID: \"a2ca8c5d-3391-4ae4-a451-8a14fe2352aa\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-jbzlm" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.918240 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4wbt\" (UniqueName: \"kubernetes.io/projected/f5b51009-d199-4b88-9158-1b7b3b1848d3-kube-api-access-r4wbt\") pod \"ironic-operator-controller-manager-545456dc4-9nm28\" (UID: \"f5b51009-d199-4b88-9158-1b7b3b1848d3\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-9nm28" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.918264 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9t9s\" (UniqueName: \"kubernetes.io/projected/0e2af601-594d-47f7-95ef-0474051dae27-kube-api-access-h9t9s\") pod \"horizon-operator-controller-manager-78bc7f9bd9-v9vbd\" (UID: \"0e2af601-594d-47f7-95ef-0474051dae27\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-v9vbd" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.918287 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9bwf\" (UniqueName: \"kubernetes.io/projected/09ff8e79-084a-4043-9061-c7007b041e86-kube-api-access-g9bwf\") pod \"neutron-operator-controller-manager-54688575f-hsvs9\" (UID: \"09ff8e79-084a-4043-9061-c7007b041e86\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-hsvs9" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 
09:17:30.918308 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mckz2\" (UniqueName: \"kubernetes.io/projected/89b24774-f0eb-4d63-a124-1b244f195163-kube-api-access-mckz2\") pod \"manila-operator-controller-manager-67d996989d-jw6hs\" (UID: \"89b24774-f0eb-4d63-a124-1b244f195163\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-jw6hs" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.918332 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swknj\" (UniqueName: \"kubernetes.io/projected/14725449-2193-4b84-b736-31c04f9f43e4-kube-api-access-swknj\") pod \"keystone-operator-controller-manager-7c789f89c6-8r8kv\" (UID: \"14725449-2193-4b84-b736-31c04f9f43e4\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-8r8kv" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.918374 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7p4n\" (UniqueName: \"kubernetes.io/projected/e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe-kube-api-access-t7p4n\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776925xf7\" (UID: \"e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776925xf7" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.918395 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmrz5\" (UniqueName: \"kubernetes.io/projected/caa33de5-0fe2-4930-bf89-0f8ad6a96ca2-kube-api-access-zmrz5\") pod \"infra-operator-controller-manager-f7fcc58b9-vqdm7\" (UID: \"caa33de5-0fe2-4930-bf89-0f8ad6a96ca2\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vqdm7" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.918422 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-87ls5\" (UniqueName: \"kubernetes.io/projected/41e8cac0-417a-4c1d-a31c-0389bdebd0ba-kube-api-access-87ls5\") pod \"ovn-operator-controller-manager-75684d597f-9fpjj\" (UID: \"41e8cac0-417a-4c1d-a31c-0389bdebd0ba\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-9fpjj" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.918443 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26vwj\" (UniqueName: \"kubernetes.io/projected/7f019778-ba45-4e4a-a6d8-dd6d056aed3b-kube-api-access-26vwj\") pod \"placement-operator-controller-manager-648564c9fc-jht6f\" (UID: \"7f019778-ba45-4e4a-a6d8-dd6d056aed3b\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jht6f" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.918462 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe-cert\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776925xf7\" (UID: \"e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776925xf7" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.918480 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqrfs\" (UniqueName: \"kubernetes.io/projected/5945c472-0f03-4666-84ca-b8f4545db411-kube-api-access-fqrfs\") pod \"heat-operator-controller-manager-cf99c678f-ltpvl\" (UID: \"5945c472-0f03-4666-84ca-b8f4545db411\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ltpvl" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.918503 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvqfk\" (UniqueName: \"kubernetes.io/projected/72be3389-d521-4742-9081-8bdc3aef0dc6-kube-api-access-tvqfk\") pod 
\"nova-operator-controller-manager-74b6b5dc96-kdxq5\" (UID: \"72be3389-d521-4742-9081-8bdc3aef0dc6\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-kdxq5" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.921705 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9fvd\" (UniqueName: \"kubernetes.io/projected/134bd541-e4b0-4e84-b85d-a50c413d6cd2-kube-api-access-r9fvd\") pod \"octavia-operator-controller-manager-5d86c7ddb7-dsfvj\" (UID: \"134bd541-e4b0-4e84-b85d-a50c413d6cd2\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-dsfvj" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.921747 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/caa33de5-0fe2-4930-bf89-0f8ad6a96ca2-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vqdm7\" (UID: \"caa33de5-0fe2-4930-bf89-0f8ad6a96ca2\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vqdm7" Feb 28 09:17:30 crc kubenswrapper[4687]: E0228 09:17:30.922260 4687 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 28 09:17:30 crc kubenswrapper[4687]: E0228 09:17:30.922332 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa33de5-0fe2-4930-bf89-0f8ad6a96ca2-cert podName:caa33de5-0fe2-4930-bf89-0f8ad6a96ca2 nodeName:}" failed. No retries permitted until 2026-02-28 09:17:31.42230973 +0000 UTC m=+843.112879067 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/caa33de5-0fe2-4930-bf89-0f8ad6a96ca2-cert") pod "infra-operator-controller-manager-f7fcc58b9-vqdm7" (UID: "caa33de5-0fe2-4930-bf89-0f8ad6a96ca2") : secret "infra-operator-webhook-server-cert" not found Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.940554 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dd5x\" (UniqueName: \"kubernetes.io/projected/a2ca8c5d-3391-4ae4-a451-8a14fe2352aa-kube-api-access-2dd5x\") pod \"mariadb-operator-controller-manager-7b6bfb6475-jbzlm\" (UID: \"a2ca8c5d-3391-4ae4-a451-8a14fe2352aa\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-jbzlm" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.940595 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9t9s\" (UniqueName: \"kubernetes.io/projected/0e2af601-594d-47f7-95ef-0474051dae27-kube-api-access-h9t9s\") pod \"horizon-operator-controller-manager-78bc7f9bd9-v9vbd\" (UID: \"0e2af601-594d-47f7-95ef-0474051dae27\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-v9vbd" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.940653 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-q5zdg"] Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.940865 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9bwf\" (UniqueName: \"kubernetes.io/projected/09ff8e79-084a-4043-9061-c7007b041e86-kube-api-access-g9bwf\") pod \"neutron-operator-controller-manager-54688575f-hsvs9\" (UID: \"09ff8e79-084a-4043-9061-c7007b041e86\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-hsvs9" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.941433 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swknj\" 
(UniqueName: \"kubernetes.io/projected/14725449-2193-4b84-b736-31c04f9f43e4-kube-api-access-swknj\") pod \"keystone-operator-controller-manager-7c789f89c6-8r8kv\" (UID: \"14725449-2193-4b84-b736-31c04f9f43e4\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-8r8kv" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.942343 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-q5zdg" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.943370 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mckz2\" (UniqueName: \"kubernetes.io/projected/89b24774-f0eb-4d63-a124-1b244f195163-kube-api-access-mckz2\") pod \"manila-operator-controller-manager-67d996989d-jw6hs\" (UID: \"89b24774-f0eb-4d63-a124-1b244f195163\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-jw6hs" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.943673 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-ff5dd" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.944107 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqrfs\" (UniqueName: \"kubernetes.io/projected/5945c472-0f03-4666-84ca-b8f4545db411-kube-api-access-fqrfs\") pod \"heat-operator-controller-manager-cf99c678f-ltpvl\" (UID: \"5945c472-0f03-4666-84ca-b8f4545db411\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ltpvl" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.944892 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9fvd\" (UniqueName: \"kubernetes.io/projected/134bd541-e4b0-4e84-b85d-a50c413d6cd2-kube-api-access-r9fvd\") pod \"octavia-operator-controller-manager-5d86c7ddb7-dsfvj\" (UID: \"134bd541-e4b0-4e84-b85d-a50c413d6cd2\") " 
pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-dsfvj" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.945000 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-q5zdg"] Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.946151 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4wbt\" (UniqueName: \"kubernetes.io/projected/f5b51009-d199-4b88-9158-1b7b3b1848d3-kube-api-access-r4wbt\") pod \"ironic-operator-controller-manager-545456dc4-9nm28\" (UID: \"f5b51009-d199-4b88-9158-1b7b3b1848d3\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-9nm28" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.948057 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvqfk\" (UniqueName: \"kubernetes.io/projected/72be3389-d521-4742-9081-8bdc3aef0dc6-kube-api-access-tvqfk\") pod \"nova-operator-controller-manager-74b6b5dc96-kdxq5\" (UID: \"72be3389-d521-4742-9081-8bdc3aef0dc6\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-kdxq5" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.950757 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmrz5\" (UniqueName: \"kubernetes.io/projected/caa33de5-0fe2-4930-bf89-0f8ad6a96ca2-kube-api-access-zmrz5\") pod \"infra-operator-controller-manager-f7fcc58b9-vqdm7\" (UID: \"caa33de5-0fe2-4930-bf89-0f8ad6a96ca2\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vqdm7" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.957675 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ltpvl" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.963360 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-fxqv8"] Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.964498 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-fxqv8" Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.965962 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-fxqv8"] Feb 28 09:17:30 crc kubenswrapper[4687]: I0228 09:17:30.977305 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-kg9sm" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.016504 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-v9vbd" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.017392 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-2t7hs"] Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.018377 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-2t7hs" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.024487 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7p4n\" (UniqueName: \"kubernetes.io/projected/e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe-kube-api-access-t7p4n\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776925xf7\" (UID: \"e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776925xf7" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.024535 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87ls5\" (UniqueName: \"kubernetes.io/projected/41e8cac0-417a-4c1d-a31c-0389bdebd0ba-kube-api-access-87ls5\") pod \"ovn-operator-controller-manager-75684d597f-9fpjj\" (UID: \"41e8cac0-417a-4c1d-a31c-0389bdebd0ba\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-9fpjj" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.024559 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26vwj\" (UniqueName: \"kubernetes.io/projected/7f019778-ba45-4e4a-a6d8-dd6d056aed3b-kube-api-access-26vwj\") pod \"placement-operator-controller-manager-648564c9fc-jht6f\" (UID: \"7f019778-ba45-4e4a-a6d8-dd6d056aed3b\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jht6f" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.024582 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe-cert\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776925xf7\" (UID: \"e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776925xf7" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 
09:17:31.024664 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7dc9\" (UniqueName: \"kubernetes.io/projected/9f7d6d86-afe8-4c99-8e5e-d81279cf5a9a-kube-api-access-s7dc9\") pod \"swift-operator-controller-manager-9b9ff9f4d-q5zdg\" (UID: \"9f7d6d86-afe8-4c99-8e5e-d81279cf5a9a\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-q5zdg" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.024689 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25pjk\" (UniqueName: \"kubernetes.io/projected/5ab4ce15-ddc0-4f3b-bdb0-29ce65884eaf-kube-api-access-25pjk\") pod \"telemetry-operator-controller-manager-5fdb694969-fxqv8\" (UID: \"5ab4ce15-ddc0-4f3b-bdb0-29ce65884eaf\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-fxqv8" Feb 28 09:17:31 crc kubenswrapper[4687]: E0228 09:17:31.026358 4687 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 28 09:17:31 crc kubenswrapper[4687]: E0228 09:17:31.026413 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe-cert podName:e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe nodeName:}" failed. No retries permitted until 2026-02-28 09:17:31.526398927 +0000 UTC m=+843.216968264 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe-cert") pod "openstack-baremetal-operator-controller-manager-7b4cc4776925xf7" (UID: "e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.044874 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-sq2w9" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.045642 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-2t7hs"] Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.047115 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-8r8kv" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.048623 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-9nm28" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.049245 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7p4n\" (UniqueName: \"kubernetes.io/projected/e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe-kube-api-access-t7p4n\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776925xf7\" (UID: \"e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776925xf7" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.055815 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26vwj\" (UniqueName: \"kubernetes.io/projected/7f019778-ba45-4e4a-a6d8-dd6d056aed3b-kube-api-access-26vwj\") pod \"placement-operator-controller-manager-648564c9fc-jht6f\" (UID: \"7f019778-ba45-4e4a-a6d8-dd6d056aed3b\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jht6f" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.055847 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87ls5\" (UniqueName: \"kubernetes.io/projected/41e8cac0-417a-4c1d-a31c-0389bdebd0ba-kube-api-access-87ls5\") pod \"ovn-operator-controller-manager-75684d597f-9fpjj\" (UID: \"41e8cac0-417a-4c1d-a31c-0389bdebd0ba\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-9fpjj" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.057330 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-jbzlm" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.064386 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-jw6hs" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.084478 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-hsvs9" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.109350 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-dsfvj" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.117392 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-kdxq5" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.125398 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7dc9\" (UniqueName: \"kubernetes.io/projected/9f7d6d86-afe8-4c99-8e5e-d81279cf5a9a-kube-api-access-s7dc9\") pod \"swift-operator-controller-manager-9b9ff9f4d-q5zdg\" (UID: \"9f7d6d86-afe8-4c99-8e5e-d81279cf5a9a\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-q5zdg" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.125443 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25pjk\" (UniqueName: \"kubernetes.io/projected/5ab4ce15-ddc0-4f3b-bdb0-29ce65884eaf-kube-api-access-25pjk\") pod \"telemetry-operator-controller-manager-5fdb694969-fxqv8\" (UID: \"5ab4ce15-ddc0-4f3b-bdb0-29ce65884eaf\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-fxqv8" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.125553 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrzkl\" (UniqueName: \"kubernetes.io/projected/ccb38bca-46b2-4c3c-a6c5-d30af68435d1-kube-api-access-qrzkl\") pod 
\"test-operator-controller-manager-55b5ff4dbb-2t7hs\" (UID: \"ccb38bca-46b2-4c3c-a6c5-d30af68435d1\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-2t7hs" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.143096 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-c92d5"] Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.144161 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-c92d5" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.156483 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25pjk\" (UniqueName: \"kubernetes.io/projected/5ab4ce15-ddc0-4f3b-bdb0-29ce65884eaf-kube-api-access-25pjk\") pod \"telemetry-operator-controller-manager-5fdb694969-fxqv8\" (UID: \"5ab4ce15-ddc0-4f3b-bdb0-29ce65884eaf\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-fxqv8" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.165203 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-c92d5"] Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.168881 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-gp5l2" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.174729 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7dc9\" (UniqueName: \"kubernetes.io/projected/9f7d6d86-afe8-4c99-8e5e-d81279cf5a9a-kube-api-access-s7dc9\") pod \"swift-operator-controller-manager-9b9ff9f4d-q5zdg\" (UID: \"9f7d6d86-afe8-4c99-8e5e-d81279cf5a9a\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-q5zdg" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.228716 4687 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xggh\" (UniqueName: \"kubernetes.io/projected/3ebd35dc-7a29-4c3f-b442-bfe29d833f06-kube-api-access-7xggh\") pod \"watcher-operator-controller-manager-bccc79885-c92d5\" (UID: \"3ebd35dc-7a29-4c3f-b442-bfe29d833f06\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-c92d5" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.229888 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrzkl\" (UniqueName: \"kubernetes.io/projected/ccb38bca-46b2-4c3c-a6c5-d30af68435d1-kube-api-access-qrzkl\") pod \"test-operator-controller-manager-55b5ff4dbb-2t7hs\" (UID: \"ccb38bca-46b2-4c3c-a6c5-d30af68435d1\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-2t7hs" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.255250 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5"] Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.256042 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrzkl\" (UniqueName: \"kubernetes.io/projected/ccb38bca-46b2-4c3c-a6c5-d30af68435d1-kube-api-access-qrzkl\") pod \"test-operator-controller-manager-55b5ff4dbb-2t7hs\" (UID: \"ccb38bca-46b2-4c3c-a6c5-d30af68435d1\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-2t7hs" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.256127 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.260786 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-zz292" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.260951 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.262460 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.268535 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-9fpjj" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.281404 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5"] Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.293808 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jht6f" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.304225 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-q5zdg" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.323283 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-fxqv8" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.332581 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-metrics-certs\") pod \"openstack-operator-controller-manager-864b865b94-72kg5\" (UID: \"005ef854-8015-4724-b7b1-42f8fe9a1497\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.332629 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-webhook-certs\") pod \"openstack-operator-controller-manager-864b865b94-72kg5\" (UID: \"005ef854-8015-4724-b7b1-42f8fe9a1497\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.332683 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xggh\" (UniqueName: \"kubernetes.io/projected/3ebd35dc-7a29-4c3f-b442-bfe29d833f06-kube-api-access-7xggh\") pod \"watcher-operator-controller-manager-bccc79885-c92d5\" (UID: \"3ebd35dc-7a29-4c3f-b442-bfe29d833f06\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-c92d5" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.332715 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxdr4\" (UniqueName: \"kubernetes.io/projected/005ef854-8015-4724-b7b1-42f8fe9a1497-kube-api-access-fxdr4\") pod \"openstack-operator-controller-manager-864b865b94-72kg5\" (UID: \"005ef854-8015-4724-b7b1-42f8fe9a1497\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" Feb 28 09:17:31 crc 
kubenswrapper[4687]: I0228 09:17:31.357506 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xggh\" (UniqueName: \"kubernetes.io/projected/3ebd35dc-7a29-4c3f-b442-bfe29d833f06-kube-api-access-7xggh\") pod \"watcher-operator-controller-manager-bccc79885-c92d5\" (UID: \"3ebd35dc-7a29-4c3f-b442-bfe29d833f06\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-c92d5" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.365329 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p64nn"] Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.366126 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p64nn" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.369614 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-2t7hs" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.373593 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-564ls" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.378726 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p64nn"] Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.434315 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-webhook-certs\") pod \"openstack-operator-controller-manager-864b865b94-72kg5\" (UID: \"005ef854-8015-4724-b7b1-42f8fe9a1497\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.435177 4687 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/caa33de5-0fe2-4930-bf89-0f8ad6a96ca2-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vqdm7\" (UID: \"caa33de5-0fe2-4930-bf89-0f8ad6a96ca2\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vqdm7" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.435225 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxdr4\" (UniqueName: \"kubernetes.io/projected/005ef854-8015-4724-b7b1-42f8fe9a1497-kube-api-access-fxdr4\") pod \"openstack-operator-controller-manager-864b865b94-72kg5\" (UID: \"005ef854-8015-4724-b7b1-42f8fe9a1497\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.435396 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kch2\" (UniqueName: \"kubernetes.io/projected/da7dfebc-ad65-4d02-a7f8-c10f9a6ac0d4-kube-api-access-9kch2\") pod \"rabbitmq-cluster-operator-manager-668c99d594-p64nn\" (UID: \"da7dfebc-ad65-4d02-a7f8-c10f9a6ac0d4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p64nn" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.435462 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-metrics-certs\") pod \"openstack-operator-controller-manager-864b865b94-72kg5\" (UID: \"005ef854-8015-4724-b7b1-42f8fe9a1497\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" Feb 28 09:17:31 crc kubenswrapper[4687]: E0228 09:17:31.435653 4687 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 28 09:17:31 crc kubenswrapper[4687]: E0228 09:17:31.435708 4687 
secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 28 09:17:31 crc kubenswrapper[4687]: E0228 09:17:31.435733 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-metrics-certs podName:005ef854-8015-4724-b7b1-42f8fe9a1497 nodeName:}" failed. No retries permitted until 2026-02-28 09:17:31.935716339 +0000 UTC m=+843.626285676 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-metrics-certs") pod "openstack-operator-controller-manager-864b865b94-72kg5" (UID: "005ef854-8015-4724-b7b1-42f8fe9a1497") : secret "metrics-server-cert" not found Feb 28 09:17:31 crc kubenswrapper[4687]: E0228 09:17:31.435778 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa33de5-0fe2-4930-bf89-0f8ad6a96ca2-cert podName:caa33de5-0fe2-4930-bf89-0f8ad6a96ca2 nodeName:}" failed. No retries permitted until 2026-02-28 09:17:32.435758408 +0000 UTC m=+844.126327746 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/caa33de5-0fe2-4930-bf89-0f8ad6a96ca2-cert") pod "infra-operator-controller-manager-f7fcc58b9-vqdm7" (UID: "caa33de5-0fe2-4930-bf89-0f8ad6a96ca2") : secret "infra-operator-webhook-server-cert" not found Feb 28 09:17:31 crc kubenswrapper[4687]: E0228 09:17:31.437188 4687 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 28 09:17:31 crc kubenswrapper[4687]: E0228 09:17:31.437264 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-webhook-certs podName:005ef854-8015-4724-b7b1-42f8fe9a1497 nodeName:}" failed. 
No retries permitted until 2026-02-28 09:17:31.93723208 +0000 UTC m=+843.627801417 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-webhook-certs") pod "openstack-operator-controller-manager-864b865b94-72kg5" (UID: "005ef854-8015-4724-b7b1-42f8fe9a1497") : secret "webhook-server-cert" not found Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.451805 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxdr4\" (UniqueName: \"kubernetes.io/projected/005ef854-8015-4724-b7b1-42f8fe9a1497-kube-api-access-fxdr4\") pod \"openstack-operator-controller-manager-864b865b94-72kg5\" (UID: \"005ef854-8015-4724-b7b1-42f8fe9a1497\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.476429 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-724kt" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.477070 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-chfpl" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.512612 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-c92d5" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.536482 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe-cert\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776925xf7\" (UID: \"e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776925xf7" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.536583 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kch2\" (UniqueName: \"kubernetes.io/projected/da7dfebc-ad65-4d02-a7f8-c10f9a6ac0d4-kube-api-access-9kch2\") pod \"rabbitmq-cluster-operator-manager-668c99d594-p64nn\" (UID: \"da7dfebc-ad65-4d02-a7f8-c10f9a6ac0d4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p64nn" Feb 28 09:17:31 crc kubenswrapper[4687]: E0228 09:17:31.536686 4687 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 28 09:17:31 crc kubenswrapper[4687]: E0228 09:17:31.536768 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe-cert podName:e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe nodeName:}" failed. No retries permitted until 2026-02-28 09:17:32.536746531 +0000 UTC m=+844.227315869 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe-cert") pod "openstack-baremetal-operator-controller-manager-7b4cc4776925xf7" (UID: "e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.551795 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kch2\" (UniqueName: \"kubernetes.io/projected/da7dfebc-ad65-4d02-a7f8-c10f9a6ac0d4-kube-api-access-9kch2\") pod \"rabbitmq-cluster-operator-manager-668c99d594-p64nn\" (UID: \"da7dfebc-ad65-4d02-a7f8-c10f9a6ac0d4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p64nn" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.686637 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p64nn" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.720560 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-9zkzk"] Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.778723 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-jtdtt"] Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.814972 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-7wrs7"] Feb 28 09:17:31 crc kubenswrapper[4687]: W0228 09:17:31.831903 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3d5a3fe_4e59_43c3_aef3_33c3e7830cb1.slice/crio-e454ddf80f8a464ed1be9afd5d2d2de26fc4af1a51364fd8eb234dff8c42d9e9 WatchSource:0}: Error finding container e454ddf80f8a464ed1be9afd5d2d2de26fc4af1a51364fd8eb234dff8c42d9e9: Status 
404 returned error can't find the container with id e454ddf80f8a464ed1be9afd5d2d2de26fc4af1a51364fd8eb234dff8c42d9e9 Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.922899 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-v9vbd"] Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.927924 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-ltpvl"] Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.931677 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-jbzlm"] Feb 28 09:17:31 crc kubenswrapper[4687]: W0228 09:17:31.937701 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2ca8c5d_3391_4ae4_a451_8a14fe2352aa.slice/crio-3d363c416deaa69ba9c79c713eb69c94d076f7a39f0b97ed4e041d7fa07b9c3e WatchSource:0}: Error finding container 3d363c416deaa69ba9c79c713eb69c94d076f7a39f0b97ed4e041d7fa07b9c3e: Status 404 returned error can't find the container with id 3d363c416deaa69ba9c79c713eb69c94d076f7a39f0b97ed4e041d7fa07b9c3e Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.943319 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-hsvs9"] Feb 28 09:17:31 crc kubenswrapper[4687]: W0228 09:17:31.950691 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09ff8e79_084a_4043_9061_c7007b041e86.slice/crio-35b1b124c920bf15c9e307a2a542332002419938c1b3a390a5e5adadf7db1f12 WatchSource:0}: Error finding container 35b1b124c920bf15c9e307a2a542332002419938c1b3a390a5e5adadf7db1f12: Status 404 returned error can't find the container with id 35b1b124c920bf15c9e307a2a542332002419938c1b3a390a5e5adadf7db1f12 Feb 28 09:17:31 
crc kubenswrapper[4687]: I0228 09:17:31.955816 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-metrics-certs\") pod \"openstack-operator-controller-manager-864b865b94-72kg5\" (UID: \"005ef854-8015-4724-b7b1-42f8fe9a1497\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" Feb 28 09:17:31 crc kubenswrapper[4687]: I0228 09:17:31.955867 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-webhook-certs\") pod \"openstack-operator-controller-manager-864b865b94-72kg5\" (UID: \"005ef854-8015-4724-b7b1-42f8fe9a1497\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" Feb 28 09:17:31 crc kubenswrapper[4687]: E0228 09:17:31.956001 4687 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 28 09:17:31 crc kubenswrapper[4687]: E0228 09:17:31.956064 4687 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 28 09:17:31 crc kubenswrapper[4687]: E0228 09:17:31.956084 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-metrics-certs podName:005ef854-8015-4724-b7b1-42f8fe9a1497 nodeName:}" failed. No retries permitted until 2026-02-28 09:17:32.956068303 +0000 UTC m=+844.646637640 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-metrics-certs") pod "openstack-operator-controller-manager-864b865b94-72kg5" (UID: "005ef854-8015-4724-b7b1-42f8fe9a1497") : secret "metrics-server-cert" not found Feb 28 09:17:31 crc kubenswrapper[4687]: E0228 09:17:31.956118 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-webhook-certs podName:005ef854-8015-4724-b7b1-42f8fe9a1497 nodeName:}" failed. No retries permitted until 2026-02-28 09:17:32.956103438 +0000 UTC m=+844.646672776 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-webhook-certs") pod "openstack-operator-controller-manager-864b865b94-72kg5" (UID: "005ef854-8015-4724-b7b1-42f8fe9a1497") : secret "webhook-server-cert" not found Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.288959 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-c92d5"] Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.292572 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-chfpl"] Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.306981 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-fxqv8"] Feb 28 09:17:32 crc kubenswrapper[4687]: W0228 09:17:32.311836 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ebd35dc_7a29_4c3f_b442_bfe29d833f06.slice/crio-a6a142bdcb6b4a6d18c14a4c78f794779965fc314a792f1b2345217b612a66aa WatchSource:0}: Error finding container a6a142bdcb6b4a6d18c14a4c78f794779965fc314a792f1b2345217b612a66aa: Status 404 returned error 
can't find the container with id a6a142bdcb6b4a6d18c14a4c78f794779965fc314a792f1b2345217b612a66aa Feb 28 09:17:32 crc kubenswrapper[4687]: W0228 09:17:32.315287 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccb38bca_46b2_4c3c_a6c5_d30af68435d1.slice/crio-44610e94f3a9f4c83bea526c10d11a9eede3eb73cba60e3f3321de2fd2540729 WatchSource:0}: Error finding container 44610e94f3a9f4c83bea526c10d11a9eede3eb73cba60e3f3321de2fd2540729: Status 404 returned error can't find the container with id 44610e94f3a9f4c83bea526c10d11a9eede3eb73cba60e3f3321de2fd2540729 Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.317306 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-2t7hs"] Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.326822 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-q5zdg"] Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.334732 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-9fpjj"] Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.337620 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-dsfvj"] Feb 28 09:17:32 crc kubenswrapper[4687]: W0228 09:17:32.338724 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f7d6d86_afe8_4c99_8e5e_d81279cf5a9a.slice/crio-89df2b320e496f7fe5156a4f3b065341ba9de2f422271482d41f9d80b74c910d WatchSource:0}: Error finding container 89df2b320e496f7fe5156a4f3b065341ba9de2f422271482d41f9d80b74c910d: Status 404 returned error can't find the container with id 89df2b320e496f7fe5156a4f3b065341ba9de2f422271482d41f9d80b74c910d Feb 28 09:17:32 crc kubenswrapper[4687]: 
I0228 09:17:32.344114 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-jw6hs"] Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.346887 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-kdxq5"] Feb 28 09:17:32 crc kubenswrapper[4687]: W0228 09:17:32.347251 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89b24774_f0eb_4d63_a124_1b244f195163.slice/crio-0bff21523307d7237a7c3ddbf222def90186526326b12bfc590b7e8cb212982a WatchSource:0}: Error finding container 0bff21523307d7237a7c3ddbf222def90186526326b12bfc590b7e8cb212982a: Status 404 returned error can't find the container with id 0bff21523307d7237a7c3ddbf222def90186526326b12bfc590b7e8cb212982a Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.349922 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-9nm28"] Feb 28 09:17:32 crc kubenswrapper[4687]: W0228 09:17:32.350489 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod134bd541_e4b0_4e84_b85d_a50c413d6cd2.slice/crio-8f22d36be3aa596b8c959179bd27d0b5431bcddeb28c881198cf9b51b0dd3ff0 WatchSource:0}: Error finding container 8f22d36be3aa596b8c959179bd27d0b5431bcddeb28c881198cf9b51b0dd3ff0: Status 404 returned error can't find the container with id 8f22d36be3aa596b8c959179bd27d0b5431bcddeb28c881198cf9b51b0dd3ff0 Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.350601 4687 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mckz2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-67d996989d-jw6hs_openstack-operators(89b24774-f0eb-4d63-a124-1b244f195163): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.351800 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-67d996989d-jw6hs" podUID="89b24774-f0eb-4d63-a124-1b244f195163" Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.352288 4687 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r9fvd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5d86c7ddb7-dsfvj_openstack-operators(134bd541-e4b0-4e84-b85d-a50c413d6cd2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.352960 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-jht6f"] Feb 28 09:17:32 crc kubenswrapper[4687]: W0228 09:17:32.352975 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72be3389_d521_4742_9081_8bdc3aef0dc6.slice/crio-9acad7fa856ec9714260fd9b899febccab579ab326ee7459439105a472af1f03 WatchSource:0}: Error finding container 9acad7fa856ec9714260fd9b899febccab579ab326ee7459439105a472af1f03: Status 404 returned error can't find the container with id 9acad7fa856ec9714260fd9b899febccab579ab326ee7459439105a472af1f03 Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.353819 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-dsfvj" podUID="134bd541-e4b0-4e84-b85d-a50c413d6cd2" Feb 28 09:17:32 crc kubenswrapper[4687]: W0228 09:17:32.355058 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5b51009_d199_4b88_9158_1b7b3b1848d3.slice/crio-5fde6786ef19bd29363305d15bb28a0fc94920c6cea4f75bfb8b3e7ad4a17635 WatchSource:0}: Error finding container 5fde6786ef19bd29363305d15bb28a0fc94920c6cea4f75bfb8b3e7ad4a17635: Status 404 returned error can't find the container with id 5fde6786ef19bd29363305d15bb28a0fc94920c6cea4f75bfb8b3e7ad4a17635 Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.355386 4687 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tvqfk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-74b6b5dc96-kdxq5_openstack-operators(72be3389-d521-4742-9081-8bdc3aef0dc6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 28 09:17:32 crc kubenswrapper[4687]: W0228 09:17:32.356036 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14725449_2193_4b84_b736_31c04f9f43e4.slice/crio-86e86fba045a4569d15f5d7c953acdb473ad06521acebed3fb49d9d24a32976a WatchSource:0}: Error finding container 86e86fba045a4569d15f5d7c953acdb473ad06521acebed3fb49d9d24a32976a: Status 404 returned error can't find the container with id 86e86fba045a4569d15f5d7c953acdb473ad06521acebed3fb49d9d24a32976a Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.356067 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-8r8kv"] Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.356613 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-kdxq5" podUID="72be3389-d521-4742-9081-8bdc3aef0dc6" Feb 28 09:17:32 crc kubenswrapper[4687]: W0228 09:17:32.356836 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f019778_ba45_4e4a_a6d8_dd6d056aed3b.slice/crio-5163c92472a34273a4a6146950906872c5ea4ff041ecfed6ba4eccc1be022dfe WatchSource:0}: Error finding container 5163c92472a34273a4a6146950906872c5ea4ff041ecfed6ba4eccc1be022dfe: Status 404 returned error can't find the container with id 5163c92472a34273a4a6146950906872c5ea4ff041ecfed6ba4eccc1be022dfe Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.357846 4687 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-swknj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7c789f89c6-8r8kv_openstack-operators(14725449-2193-4b84-b736-31c04f9f43e4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.358137 4687 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:e41dfadd2c3bbcae29f8c43cd2feea6724a48cdef127d65d1d37816bb9945a01,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r4wbt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-545456dc4-9nm28_openstack-operators(f5b51009-d199-4b88-9158-1b7b3b1848d3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.358947 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-8r8kv" podUID="14725449-2193-4b84-b736-31c04f9f43e4" Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.359108 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p64nn"] Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.359361 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-9nm28" podUID="f5b51009-d199-4b88-9158-1b7b3b1848d3" Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.359404 4687 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-26vwj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-648564c9fc-jht6f_openstack-operators(7f019778-ba45-4e4a-a6d8-dd6d056aed3b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 28 09:17:32 crc kubenswrapper[4687]: W0228 09:17:32.359744 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda7dfebc_ad65_4d02_a7f8_c10f9a6ac0d4.slice/crio-f7d0f527a5a9338f171a2d110e507f0182874d035d719f482949088ee84d8acb WatchSource:0}: Error finding container f7d0f527a5a9338f171a2d110e507f0182874d035d719f482949088ee84d8acb: Status 404 returned error can't find the container with id f7d0f527a5a9338f171a2d110e507f0182874d035d719f482949088ee84d8acb Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.360713 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jht6f" podUID="7f019778-ba45-4e4a-a6d8-dd6d056aed3b" Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.362047 4687 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9kch2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-p64nn_openstack-operators(da7dfebc-ad65-4d02-a7f8-c10f9a6ac0d4): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.363700 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p64nn" podUID="da7dfebc-ad65-4d02-a7f8-c10f9a6ac0d4" Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.407448 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-9nm28" event={"ID":"f5b51009-d199-4b88-9158-1b7b3b1848d3","Type":"ContainerStarted","Data":"5fde6786ef19bd29363305d15bb28a0fc94920c6cea4f75bfb8b3e7ad4a17635"} Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.409266 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:e41dfadd2c3bbcae29f8c43cd2feea6724a48cdef127d65d1d37816bb9945a01\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-9nm28" podUID="f5b51009-d199-4b88-9158-1b7b3b1848d3" Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.411625 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-9fpjj" event={"ID":"41e8cac0-417a-4c1d-a31c-0389bdebd0ba","Type":"ContainerStarted","Data":"a3c499e0ac2e3ab3221d7a6295aa9a8d62fb3c1ddbaf84c38de1521a4d60267f"} Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.413383 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-q5zdg" event={"ID":"9f7d6d86-afe8-4c99-8e5e-d81279cf5a9a","Type":"ContainerStarted","Data":"89df2b320e496f7fe5156a4f3b065341ba9de2f422271482d41f9d80b74c910d"} Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.415044 4687 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p64nn" event={"ID":"da7dfebc-ad65-4d02-a7f8-c10f9a6ac0d4","Type":"ContainerStarted","Data":"f7d0f527a5a9338f171a2d110e507f0182874d035d719f482949088ee84d8acb"} Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.415913 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p64nn" podUID="da7dfebc-ad65-4d02-a7f8-c10f9a6ac0d4" Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.416111 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-kdxq5" event={"ID":"72be3389-d521-4742-9081-8bdc3aef0dc6","Type":"ContainerStarted","Data":"9acad7fa856ec9714260fd9b899febccab579ab326ee7459439105a472af1f03"} Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.417521 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84\\\"\"" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-kdxq5" podUID="72be3389-d521-4742-9081-8bdc3aef0dc6" Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.417940 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-7wrs7" event={"ID":"c3d5a3fe-4e59-43c3-aef3-33c3e7830cb1","Type":"ContainerStarted","Data":"e454ddf80f8a464ed1be9afd5d2d2de26fc4af1a51364fd8eb234dff8c42d9e9"} Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.418954 4687 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-c92d5" event={"ID":"3ebd35dc-7a29-4c3f-b442-bfe29d833f06","Type":"ContainerStarted","Data":"a6a142bdcb6b4a6d18c14a4c78f794779965fc314a792f1b2345217b612a66aa"} Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.421311 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-chfpl" event={"ID":"40ae4140-3768-425a-9791-234afb6297fe","Type":"ContainerStarted","Data":"2a1b9598a99d7437bccb7720af834ad3b687438aaf74c41912bbcead36ce0f7b"} Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.430819 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-dsfvj" event={"ID":"134bd541-e4b0-4e84-b85d-a50c413d6cd2","Type":"ContainerStarted","Data":"8f22d36be3aa596b8c959179bd27d0b5431bcddeb28c881198cf9b51b0dd3ff0"} Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.433619 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-dsfvj" podUID="134bd541-e4b0-4e84-b85d-a50c413d6cd2" Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.435213 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-jbzlm" event={"ID":"a2ca8c5d-3391-4ae4-a451-8a14fe2352aa","Type":"ContainerStarted","Data":"3d363c416deaa69ba9c79c713eb69c94d076f7a39f0b97ed4e041d7fa07b9c3e"} Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.439454 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-fxqv8" 
event={"ID":"5ab4ce15-ddc0-4f3b-bdb0-29ce65884eaf","Type":"ContainerStarted","Data":"b25abf2fc13ec7e82cd42f04d313f77c19bfd52222ddd84d5967e11d7f69d6cf"} Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.446527 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9zkzk" event={"ID":"30b87ec4-ee50-402d-8afc-a3f9241bbc4c","Type":"ContainerStarted","Data":"f2fd637308e92df2b6ad82f24885d546b4018a3ded995318fc4b9345c9c5d8a2"} Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.447516 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jht6f" event={"ID":"7f019778-ba45-4e4a-a6d8-dd6d056aed3b","Type":"ContainerStarted","Data":"5163c92472a34273a4a6146950906872c5ea4ff041ecfed6ba4eccc1be022dfe"} Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.452173 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jht6f" podUID="7f019778-ba45-4e4a-a6d8-dd6d056aed3b" Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.457134 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-jtdtt" event={"ID":"dc30956e-12c6-4973-a99f-ae4b502abb17","Type":"ContainerStarted","Data":"aef1e1a0278cfe0c0f1b52416933e313960fbe97e493a5b19f6b2f63827cf8c2"} Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.463017 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/caa33de5-0fe2-4930-bf89-0f8ad6a96ca2-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vqdm7\" (UID: 
\"caa33de5-0fe2-4930-bf89-0f8ad6a96ca2\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vqdm7" Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.464072 4687 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.464197 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa33de5-0fe2-4930-bf89-0f8ad6a96ca2-cert podName:caa33de5-0fe2-4930-bf89-0f8ad6a96ca2 nodeName:}" failed. No retries permitted until 2026-02-28 09:17:34.464175573 +0000 UTC m=+846.154744910 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/caa33de5-0fe2-4930-bf89-0f8ad6a96ca2-cert") pod "infra-operator-controller-manager-f7fcc58b9-vqdm7" (UID: "caa33de5-0fe2-4930-bf89-0f8ad6a96ca2") : secret "infra-operator-webhook-server-cert" not found Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.479745 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-8r8kv" event={"ID":"14725449-2193-4b84-b736-31c04f9f43e4","Type":"ContainerStarted","Data":"86e86fba045a4569d15f5d7c953acdb473ad06521acebed3fb49d9d24a32976a"} Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.481416 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-8r8kv" podUID="14725449-2193-4b84-b736-31c04f9f43e4" Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.481705 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/manila-operator-controller-manager-67d996989d-jw6hs" event={"ID":"89b24774-f0eb-4d63-a124-1b244f195163","Type":"ContainerStarted","Data":"0bff21523307d7237a7c3ddbf222def90186526326b12bfc590b7e8cb212982a"} Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.483599 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26\\\"\"" pod="openstack-operators/manila-operator-controller-manager-67d996989d-jw6hs" podUID="89b24774-f0eb-4d63-a124-1b244f195163" Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.485423 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-2t7hs" event={"ID":"ccb38bca-46b2-4c3c-a6c5-d30af68435d1","Type":"ContainerStarted","Data":"44610e94f3a9f4c83bea526c10d11a9eede3eb73cba60e3f3321de2fd2540729"} Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.488815 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-v9vbd" event={"ID":"0e2af601-594d-47f7-95ef-0474051dae27","Type":"ContainerStarted","Data":"6d8a624b50a06318e6b047ed634022a32974a4439d0e5cd04307d24f1116a94b"} Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.491809 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-hsvs9" event={"ID":"09ff8e79-084a-4043-9061-c7007b041e86","Type":"ContainerStarted","Data":"35b1b124c920bf15c9e307a2a542332002419938c1b3a390a5e5adadf7db1f12"} Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.492824 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ltpvl" 
event={"ID":"5945c472-0f03-4666-84ca-b8f4545db411","Type":"ContainerStarted","Data":"27e83781c1272861c13f6df69e6242fd5dda4e88504d11631d443e164d682425"} Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.565278 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe-cert\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776925xf7\" (UID: \"e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776925xf7" Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.565575 4687 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.565631 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe-cert podName:e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe nodeName:}" failed. No retries permitted until 2026-02-28 09:17:34.565614224 +0000 UTC m=+846.256183561 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe-cert") pod "openstack-baremetal-operator-controller-manager-7b4cc4776925xf7" (UID: "e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.978861 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-metrics-certs\") pod \"openstack-operator-controller-manager-864b865b94-72kg5\" (UID: \"005ef854-8015-4724-b7b1-42f8fe9a1497\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.979259 4687 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.979493 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-metrics-certs podName:005ef854-8015-4724-b7b1-42f8fe9a1497 nodeName:}" failed. No retries permitted until 2026-02-28 09:17:34.979476195 +0000 UTC m=+846.670045533 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-metrics-certs") pod "openstack-operator-controller-manager-864b865b94-72kg5" (UID: "005ef854-8015-4724-b7b1-42f8fe9a1497") : secret "metrics-server-cert" not found Feb 28 09:17:32 crc kubenswrapper[4687]: I0228 09:17:32.980005 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-webhook-certs\") pod \"openstack-operator-controller-manager-864b865b94-72kg5\" (UID: \"005ef854-8015-4724-b7b1-42f8fe9a1497\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.980100 4687 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 28 09:17:32 crc kubenswrapper[4687]: E0228 09:17:32.980333 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-webhook-certs podName:005ef854-8015-4724-b7b1-42f8fe9a1497 nodeName:}" failed. No retries permitted until 2026-02-28 09:17:34.980325032 +0000 UTC m=+846.670894369 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-webhook-certs") pod "openstack-operator-controller-manager-864b865b94-72kg5" (UID: "005ef854-8015-4724-b7b1-42f8fe9a1497") : secret "webhook-server-cert" not found Feb 28 09:17:33 crc kubenswrapper[4687]: E0228 09:17:33.502329 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-dsfvj" podUID="134bd541-e4b0-4e84-b85d-a50c413d6cd2" Feb 28 09:17:33 crc kubenswrapper[4687]: E0228 09:17:33.502481 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-8r8kv" podUID="14725449-2193-4b84-b736-31c04f9f43e4" Feb 28 09:17:33 crc kubenswrapper[4687]: E0228 09:17:33.504163 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jht6f" podUID="7f019778-ba45-4e4a-a6d8-dd6d056aed3b" Feb 28 09:17:33 crc kubenswrapper[4687]: E0228 09:17:33.505401 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84\\\"\"" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-kdxq5" podUID="72be3389-d521-4742-9081-8bdc3aef0dc6" Feb 28 09:17:33 crc kubenswrapper[4687]: E0228 09:17:33.505784 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26\\\"\"" pod="openstack-operators/manila-operator-controller-manager-67d996989d-jw6hs" podUID="89b24774-f0eb-4d63-a124-1b244f195163" Feb 28 09:17:33 crc kubenswrapper[4687]: E0228 09:17:33.505787 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p64nn" podUID="da7dfebc-ad65-4d02-a7f8-c10f9a6ac0d4" Feb 28 09:17:33 crc kubenswrapper[4687]: E0228 09:17:33.505834 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:e41dfadd2c3bbcae29f8c43cd2feea6724a48cdef127d65d1d37816bb9945a01\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-9nm28" podUID="f5b51009-d199-4b88-9158-1b7b3b1848d3" Feb 28 09:17:34 crc kubenswrapper[4687]: I0228 09:17:34.504202 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/caa33de5-0fe2-4930-bf89-0f8ad6a96ca2-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vqdm7\" (UID: 
\"caa33de5-0fe2-4930-bf89-0f8ad6a96ca2\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vqdm7" Feb 28 09:17:34 crc kubenswrapper[4687]: E0228 09:17:34.504439 4687 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 28 09:17:34 crc kubenswrapper[4687]: E0228 09:17:34.504709 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa33de5-0fe2-4930-bf89-0f8ad6a96ca2-cert podName:caa33de5-0fe2-4930-bf89-0f8ad6a96ca2 nodeName:}" failed. No retries permitted until 2026-02-28 09:17:38.504678894 +0000 UTC m=+850.195248231 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/caa33de5-0fe2-4930-bf89-0f8ad6a96ca2-cert") pod "infra-operator-controller-manager-f7fcc58b9-vqdm7" (UID: "caa33de5-0fe2-4930-bf89-0f8ad6a96ca2") : secret "infra-operator-webhook-server-cert" not found Feb 28 09:17:34 crc kubenswrapper[4687]: I0228 09:17:34.607457 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe-cert\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776925xf7\" (UID: \"e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776925xf7" Feb 28 09:17:34 crc kubenswrapper[4687]: E0228 09:17:34.607651 4687 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 28 09:17:34 crc kubenswrapper[4687]: E0228 09:17:34.607732 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe-cert podName:e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe nodeName:}" failed. 
No retries permitted until 2026-02-28 09:17:38.607712585 +0000 UTC m=+850.298281922 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe-cert") pod "openstack-baremetal-operator-controller-manager-7b4cc4776925xf7" (UID: "e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 28 09:17:35 crc kubenswrapper[4687]: I0228 09:17:35.013454 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-metrics-certs\") pod \"openstack-operator-controller-manager-864b865b94-72kg5\" (UID: \"005ef854-8015-4724-b7b1-42f8fe9a1497\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" Feb 28 09:17:35 crc kubenswrapper[4687]: I0228 09:17:35.013528 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-webhook-certs\") pod \"openstack-operator-controller-manager-864b865b94-72kg5\" (UID: \"005ef854-8015-4724-b7b1-42f8fe9a1497\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" Feb 28 09:17:35 crc kubenswrapper[4687]: E0228 09:17:35.013745 4687 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 28 09:17:35 crc kubenswrapper[4687]: E0228 09:17:35.013766 4687 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 28 09:17:35 crc kubenswrapper[4687]: E0228 09:17:35.013842 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-metrics-certs podName:005ef854-8015-4724-b7b1-42f8fe9a1497 nodeName:}" failed. 
No retries permitted until 2026-02-28 09:17:39.013820381 +0000 UTC m=+850.704389717 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-metrics-certs") pod "openstack-operator-controller-manager-864b865b94-72kg5" (UID: "005ef854-8015-4724-b7b1-42f8fe9a1497") : secret "metrics-server-cert" not found Feb 28 09:17:35 crc kubenswrapper[4687]: E0228 09:17:35.013866 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-webhook-certs podName:005ef854-8015-4724-b7b1-42f8fe9a1497 nodeName:}" failed. No retries permitted until 2026-02-28 09:17:39.013857931 +0000 UTC m=+850.704427267 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-webhook-certs") pod "openstack-operator-controller-manager-864b865b94-72kg5" (UID: "005ef854-8015-4724-b7b1-42f8fe9a1497") : secret "webhook-server-cert" not found Feb 28 09:17:38 crc kubenswrapper[4687]: I0228 09:17:38.575573 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/caa33de5-0fe2-4930-bf89-0f8ad6a96ca2-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vqdm7\" (UID: \"caa33de5-0fe2-4930-bf89-0f8ad6a96ca2\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vqdm7" Feb 28 09:17:38 crc kubenswrapper[4687]: E0228 09:17:38.575744 4687 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 28 09:17:38 crc kubenswrapper[4687]: E0228 09:17:38.576014 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa33de5-0fe2-4930-bf89-0f8ad6a96ca2-cert podName:caa33de5-0fe2-4930-bf89-0f8ad6a96ca2 nodeName:}" failed. 
No retries permitted until 2026-02-28 09:17:46.575997553 +0000 UTC m=+858.266566890 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/caa33de5-0fe2-4930-bf89-0f8ad6a96ca2-cert") pod "infra-operator-controller-manager-f7fcc58b9-vqdm7" (UID: "caa33de5-0fe2-4930-bf89-0f8ad6a96ca2") : secret "infra-operator-webhook-server-cert" not found Feb 28 09:17:38 crc kubenswrapper[4687]: I0228 09:17:38.680663 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe-cert\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776925xf7\" (UID: \"e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776925xf7" Feb 28 09:17:38 crc kubenswrapper[4687]: E0228 09:17:38.680860 4687 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 28 09:17:38 crc kubenswrapper[4687]: E0228 09:17:38.680928 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe-cert podName:e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe nodeName:}" failed. No retries permitted until 2026-02-28 09:17:46.68091146 +0000 UTC m=+858.371480797 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe-cert") pod "openstack-baremetal-operator-controller-manager-7b4cc4776925xf7" (UID: "e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 28 09:17:39 crc kubenswrapper[4687]: I0228 09:17:39.087132 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-metrics-certs\") pod \"openstack-operator-controller-manager-864b865b94-72kg5\" (UID: \"005ef854-8015-4724-b7b1-42f8fe9a1497\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" Feb 28 09:17:39 crc kubenswrapper[4687]: I0228 09:17:39.087594 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-webhook-certs\") pod \"openstack-operator-controller-manager-864b865b94-72kg5\" (UID: \"005ef854-8015-4724-b7b1-42f8fe9a1497\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" Feb 28 09:17:39 crc kubenswrapper[4687]: E0228 09:17:39.087361 4687 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 28 09:17:39 crc kubenswrapper[4687]: E0228 09:17:39.087790 4687 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 28 09:17:39 crc kubenswrapper[4687]: E0228 09:17:39.087864 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-webhook-certs podName:005ef854-8015-4724-b7b1-42f8fe9a1497 nodeName:}" failed. No retries permitted until 2026-02-28 09:17:47.087838196 +0000 UTC m=+858.778407533 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-webhook-certs") pod "openstack-operator-controller-manager-864b865b94-72kg5" (UID: "005ef854-8015-4724-b7b1-42f8fe9a1497") : secret "webhook-server-cert" not found Feb 28 09:17:39 crc kubenswrapper[4687]: E0228 09:17:39.087883 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-metrics-certs podName:005ef854-8015-4724-b7b1-42f8fe9a1497 nodeName:}" failed. No retries permitted until 2026-02-28 09:17:47.087875206 +0000 UTC m=+858.778444542 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-metrics-certs") pod "openstack-operator-controller-manager-864b865b94-72kg5" (UID: "005ef854-8015-4724-b7b1-42f8fe9a1497") : secret "metrics-server-cert" not found Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.550751 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9zkzk" event={"ID":"30b87ec4-ee50-402d-8afc-a3f9241bbc4c","Type":"ContainerStarted","Data":"5e194dc21b6c0a832481b594624717e653d6167f8ad83d77654a4f10dba9e06c"} Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.551047 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9zkzk" Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.552176 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-q5zdg" event={"ID":"9f7d6d86-afe8-4c99-8e5e-d81279cf5a9a","Type":"ContainerStarted","Data":"e5417d35eace0f257e599ad159c6bb585007bda8e834eef9533826b361c40865"} Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.552249 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-q5zdg" Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.553785 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-fxqv8" event={"ID":"5ab4ce15-ddc0-4f3b-bdb0-29ce65884eaf","Type":"ContainerStarted","Data":"a4e4c4797e5bfe07e61cdb366fa482d9914d10238b6d1337c2c3b7b1bf1f2816"} Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.554219 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-fxqv8" Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.555377 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-2t7hs" event={"ID":"ccb38bca-46b2-4c3c-a6c5-d30af68435d1","Type":"ContainerStarted","Data":"789fb47ecef1576f6faf56e73066892ca7fd15ea7c6c89df18218219cb982df7"} Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.555750 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-2t7hs" Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.557153 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-v9vbd" event={"ID":"0e2af601-594d-47f7-95ef-0474051dae27","Type":"ContainerStarted","Data":"3c44e71e33fb9a440459165bb68dc7167060f0f96119bb41b2742ed95addc8bc"} Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.557501 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-v9vbd" Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.559190 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ltpvl" 
event={"ID":"5945c472-0f03-4666-84ca-b8f4545db411","Type":"ContainerStarted","Data":"5f412f7d6228c5b55a9fe5ffaab035d2890bf0b9137f26fbff95706ed855a4ac"} Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.559524 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ltpvl" Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.560678 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-jtdtt" event={"ID":"dc30956e-12c6-4973-a99f-ae4b502abb17","Type":"ContainerStarted","Data":"6a8a62efbafd530e41a35c37a06d03b80f345ed54394d5a2741999e37ca7a142"} Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.561128 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-jtdtt" Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.563486 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-9fpjj" event={"ID":"41e8cac0-417a-4c1d-a31c-0389bdebd0ba","Type":"ContainerStarted","Data":"96414b66f786d71a4a322f2f9a8d6a77816bb03cd2a5ae67e699db4292a37eb4"} Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.563823 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-9fpjj" Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.564920 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-c92d5" event={"ID":"3ebd35dc-7a29-4c3f-b442-bfe29d833f06","Type":"ContainerStarted","Data":"96f0c3781745d74ffb66f5ef96075cc0a7b989f0e967c87fb7dc9546301f48f9"} Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.565276 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/watcher-operator-controller-manager-bccc79885-c92d5" Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.566647 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-7wrs7" event={"ID":"c3d5a3fe-4e59-43c3-aef3-33c3e7830cb1","Type":"ContainerStarted","Data":"f4d29a4312fc5d6200c1ec6d43fb70c8bfb811d62d06b38450b33989c149c6a1"} Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.566970 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-7wrs7" Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.567954 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-chfpl" event={"ID":"40ae4140-3768-425a-9791-234afb6297fe","Type":"ContainerStarted","Data":"70c7b4e1ac1a5e26c32397afbd9cd270b08f96a970d1be06a0323363971a1654"} Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.568300 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-chfpl" Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.569232 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-hsvs9" event={"ID":"09ff8e79-084a-4043-9061-c7007b041e86","Type":"ContainerStarted","Data":"e1502d0dc46de1e2adfb9f0d02a8963ad08c60bb89ce900e0c7a53c70af2195c"} Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.569547 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54688575f-hsvs9" Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.570517 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-jbzlm" 
event={"ID":"a2ca8c5d-3391-4ae4-a451-8a14fe2352aa","Type":"ContainerStarted","Data":"a5eb5aa6e3b13350c55c157d128e8066b17a52713c17acc077bd27bc3da9a02b"} Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.570834 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-jbzlm" Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.613115 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54688575f-hsvs9" podStartSLOduration=2.598873726 podStartE2EDuration="10.613094696s" podCreationTimestamp="2026-02-28 09:17:30 +0000 UTC" firstStartedPulling="2026-02-28 09:17:31.952056086 +0000 UTC m=+843.642625423" lastFinishedPulling="2026-02-28 09:17:39.966277055 +0000 UTC m=+851.656846393" observedRunningTime="2026-02-28 09:17:40.61068317 +0000 UTC m=+852.301252508" watchObservedRunningTime="2026-02-28 09:17:40.613094696 +0000 UTC m=+852.303664034" Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.615745 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9zkzk" podStartSLOduration=2.380692157 podStartE2EDuration="10.615739571s" podCreationTimestamp="2026-02-28 09:17:30 +0000 UTC" firstStartedPulling="2026-02-28 09:17:31.730464512 +0000 UTC m=+843.421033850" lastFinishedPulling="2026-02-28 09:17:39.965511927 +0000 UTC m=+851.656081264" observedRunningTime="2026-02-28 09:17:40.587868064 +0000 UTC m=+852.278437401" watchObservedRunningTime="2026-02-28 09:17:40.615739571 +0000 UTC m=+852.306308909" Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.633191 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-c92d5" podStartSLOduration=1.986743376 podStartE2EDuration="9.633173827s" podCreationTimestamp="2026-02-28 09:17:31 
+0000 UTC" firstStartedPulling="2026-02-28 09:17:32.317009609 +0000 UTC m=+844.007578945" lastFinishedPulling="2026-02-28 09:17:39.963440059 +0000 UTC m=+851.654009396" observedRunningTime="2026-02-28 09:17:40.631052175 +0000 UTC m=+852.321621512" watchObservedRunningTime="2026-02-28 09:17:40.633173827 +0000 UTC m=+852.323743164" Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.654174 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-q5zdg" podStartSLOduration=3.011456071 podStartE2EDuration="10.654160302s" podCreationTimestamp="2026-02-28 09:17:30 +0000 UTC" firstStartedPulling="2026-02-28 09:17:32.341211183 +0000 UTC m=+844.031780520" lastFinishedPulling="2026-02-28 09:17:39.983915414 +0000 UTC m=+851.674484751" observedRunningTime="2026-02-28 09:17:40.651983938 +0000 UTC m=+852.342553275" watchObservedRunningTime="2026-02-28 09:17:40.654160302 +0000 UTC m=+852.344729639" Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.681354 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-9fpjj" podStartSLOduration=3.035586761 podStartE2EDuration="10.681335069s" podCreationTimestamp="2026-02-28 09:17:30 +0000 UTC" firstStartedPulling="2026-02-28 09:17:32.332436326 +0000 UTC m=+844.023005663" lastFinishedPulling="2026-02-28 09:17:39.978184635 +0000 UTC m=+851.668753971" observedRunningTime="2026-02-28 09:17:40.678174564 +0000 UTC m=+852.368743901" watchObservedRunningTime="2026-02-28 09:17:40.681335069 +0000 UTC m=+852.371904406" Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.704788 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-jtdtt" podStartSLOduration=2.532777918 podStartE2EDuration="10.704771906s" podCreationTimestamp="2026-02-28 09:17:30 +0000 UTC" 
firstStartedPulling="2026-02-28 09:17:31.793537055 +0000 UTC m=+843.484106392" lastFinishedPulling="2026-02-28 09:17:39.965531043 +0000 UTC m=+851.656100380" observedRunningTime="2026-02-28 09:17:40.692419159 +0000 UTC m=+852.382988496" watchObservedRunningTime="2026-02-28 09:17:40.704771906 +0000 UTC m=+852.395341243" Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.728286 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-jbzlm" podStartSLOduration=2.702096883 podStartE2EDuration="10.728269324s" podCreationTimestamp="2026-02-28 09:17:30 +0000 UTC" firstStartedPulling="2026-02-28 09:17:31.940063988 +0000 UTC m=+843.630633325" lastFinishedPulling="2026-02-28 09:17:39.966236429 +0000 UTC m=+851.656805766" observedRunningTime="2026-02-28 09:17:40.727892947 +0000 UTC m=+852.418462283" watchObservedRunningTime="2026-02-28 09:17:40.728269324 +0000 UTC m=+852.418838661" Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.757521 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-chfpl" podStartSLOduration=3.091557644 podStartE2EDuration="10.757504928s" podCreationTimestamp="2026-02-28 09:17:30 +0000 UTC" firstStartedPulling="2026-02-28 09:17:32.311769271 +0000 UTC m=+844.002338597" lastFinishedPulling="2026-02-28 09:17:39.977716543 +0000 UTC m=+851.668285881" observedRunningTime="2026-02-28 09:17:40.751799816 +0000 UTC m=+852.442369153" watchObservedRunningTime="2026-02-28 09:17:40.757504928 +0000 UTC m=+852.448074264" Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.809637 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-2t7hs" podStartSLOduration=3.160493164 podStartE2EDuration="10.809615049s" podCreationTimestamp="2026-02-28 09:17:30 +0000 UTC" firstStartedPulling="2026-02-28 
09:17:32.317718302 +0000 UTC m=+844.008287638" lastFinishedPulling="2026-02-28 09:17:39.966840186 +0000 UTC m=+851.657409523" observedRunningTime="2026-02-28 09:17:40.809387621 +0000 UTC m=+852.499956968" watchObservedRunningTime="2026-02-28 09:17:40.809615049 +0000 UTC m=+852.500184386" Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.820940 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-v9vbd" podStartSLOduration=2.785359423 podStartE2EDuration="10.820921607s" podCreationTimestamp="2026-02-28 09:17:30 +0000 UTC" firstStartedPulling="2026-02-28 09:17:31.927621333 +0000 UTC m=+843.618190670" lastFinishedPulling="2026-02-28 09:17:39.963183527 +0000 UTC m=+851.653752854" observedRunningTime="2026-02-28 09:17:40.778057889 +0000 UTC m=+852.468627226" watchObservedRunningTime="2026-02-28 09:17:40.820921607 +0000 UTC m=+852.511491125" Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.870731 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ltpvl" podStartSLOduration=2.832768142 podStartE2EDuration="10.870713207s" podCreationTimestamp="2026-02-28 09:17:30 +0000 UTC" firstStartedPulling="2026-02-28 09:17:31.928554117 +0000 UTC m=+843.619123455" lastFinishedPulling="2026-02-28 09:17:39.966499183 +0000 UTC m=+851.657068520" observedRunningTime="2026-02-28 09:17:40.867655756 +0000 UTC m=+852.558225093" watchObservedRunningTime="2026-02-28 09:17:40.870713207 +0000 UTC m=+852.561282545" Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.873326 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-fxqv8" podStartSLOduration=3.213558494 podStartE2EDuration="10.873321364s" podCreationTimestamp="2026-02-28 09:17:30 +0000 UTC" firstStartedPulling="2026-02-28 09:17:32.326809332 +0000 UTC 
m=+844.017378669" lastFinishedPulling="2026-02-28 09:17:39.986572202 +0000 UTC m=+851.677141539" observedRunningTime="2026-02-28 09:17:40.846834501 +0000 UTC m=+852.537403838" watchObservedRunningTime="2026-02-28 09:17:40.873321364 +0000 UTC m=+852.563890701" Feb 28 09:17:40 crc kubenswrapper[4687]: I0228 09:17:40.891287 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-7wrs7" podStartSLOduration=2.767342603 podStartE2EDuration="10.8912727s" podCreationTimestamp="2026-02-28 09:17:30 +0000 UTC" firstStartedPulling="2026-02-28 09:17:31.8391373 +0000 UTC m=+843.529706638" lastFinishedPulling="2026-02-28 09:17:39.963067399 +0000 UTC m=+851.653636735" observedRunningTime="2026-02-28 09:17:40.888543557 +0000 UTC m=+852.579112894" watchObservedRunningTime="2026-02-28 09:17:40.8912727 +0000 UTC m=+852.581842037" Feb 28 09:17:46 crc kubenswrapper[4687]: I0228 09:17:46.598048 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/caa33de5-0fe2-4930-bf89-0f8ad6a96ca2-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vqdm7\" (UID: \"caa33de5-0fe2-4930-bf89-0f8ad6a96ca2\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vqdm7" Feb 28 09:17:46 crc kubenswrapper[4687]: E0228 09:17:46.598280 4687 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 28 09:17:46 crc kubenswrapper[4687]: E0228 09:17:46.598778 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa33de5-0fe2-4930-bf89-0f8ad6a96ca2-cert podName:caa33de5-0fe2-4930-bf89-0f8ad6a96ca2 nodeName:}" failed. No retries permitted until 2026-02-28 09:18:02.598758717 +0000 UTC m=+874.289328054 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/caa33de5-0fe2-4930-bf89-0f8ad6a96ca2-cert") pod "infra-operator-controller-manager-f7fcc58b9-vqdm7" (UID: "caa33de5-0fe2-4930-bf89-0f8ad6a96ca2") : secret "infra-operator-webhook-server-cert" not found Feb 28 09:17:46 crc kubenswrapper[4687]: I0228 09:17:46.699484 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe-cert\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776925xf7\" (UID: \"e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776925xf7" Feb 28 09:17:46 crc kubenswrapper[4687]: E0228 09:17:46.699706 4687 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 28 09:17:46 crc kubenswrapper[4687]: E0228 09:17:46.699760 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe-cert podName:e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe nodeName:}" failed. No retries permitted until 2026-02-28 09:18:02.699744836 +0000 UTC m=+874.390314173 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe-cert") pod "openstack-baremetal-operator-controller-manager-7b4cc4776925xf7" (UID: "e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 28 09:17:47 crc kubenswrapper[4687]: I0228 09:17:47.106247 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-metrics-certs\") pod \"openstack-operator-controller-manager-864b865b94-72kg5\" (UID: \"005ef854-8015-4724-b7b1-42f8fe9a1497\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" Feb 28 09:17:47 crc kubenswrapper[4687]: I0228 09:17:47.106300 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-webhook-certs\") pod \"openstack-operator-controller-manager-864b865b94-72kg5\" (UID: \"005ef854-8015-4724-b7b1-42f8fe9a1497\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" Feb 28 09:17:47 crc kubenswrapper[4687]: E0228 09:17:47.106441 4687 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 28 09:17:47 crc kubenswrapper[4687]: E0228 09:17:47.106490 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-webhook-certs podName:005ef854-8015-4724-b7b1-42f8fe9a1497 nodeName:}" failed. No retries permitted until 2026-02-28 09:18:03.106475213 +0000 UTC m=+874.797044550 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-webhook-certs") pod "openstack-operator-controller-manager-864b865b94-72kg5" (UID: "005ef854-8015-4724-b7b1-42f8fe9a1497") : secret "webhook-server-cert" not found Feb 28 09:17:47 crc kubenswrapper[4687]: E0228 09:17:47.106828 4687 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 28 09:17:47 crc kubenswrapper[4687]: E0228 09:17:47.106921 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-metrics-certs podName:005ef854-8015-4724-b7b1-42f8fe9a1497 nodeName:}" failed. No retries permitted until 2026-02-28 09:18:03.106899932 +0000 UTC m=+874.797469269 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-metrics-certs") pod "openstack-operator-controller-manager-864b865b94-72kg5" (UID: "005ef854-8015-4724-b7b1-42f8fe9a1497") : secret "metrics-server-cert" not found Feb 28 09:17:47 crc kubenswrapper[4687]: I0228 09:17:47.626297 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-9nm28" event={"ID":"f5b51009-d199-4b88-9158-1b7b3b1848d3","Type":"ContainerStarted","Data":"e21e6e5ff92dba4e776648ac6f1e5a8f3577c31ed6867e879d4bbdc80b887fd5"} Feb 28 09:17:47 crc kubenswrapper[4687]: I0228 09:17:47.626740 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-9nm28" Feb 28 09:17:47 crc kubenswrapper[4687]: I0228 09:17:47.640535 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-9nm28" podStartSLOduration=3.362979021 podStartE2EDuration="17.640516557s" 
podCreationTimestamp="2026-02-28 09:17:30 +0000 UTC" firstStartedPulling="2026-02-28 09:17:32.358045418 +0000 UTC m=+844.048614756" lastFinishedPulling="2026-02-28 09:17:46.635582955 +0000 UTC m=+858.326152292" observedRunningTime="2026-02-28 09:17:47.638529849 +0000 UTC m=+859.329099207" watchObservedRunningTime="2026-02-28 09:17:47.640516557 +0000 UTC m=+859.331085893" Feb 28 09:17:50 crc kubenswrapper[4687]: I0228 09:17:50.865589 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-jtdtt" Feb 28 09:17:50 crc kubenswrapper[4687]: I0228 09:17:50.876074 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-7wrs7" Feb 28 09:17:50 crc kubenswrapper[4687]: I0228 09:17:50.898417 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9zkzk" Feb 28 09:17:50 crc kubenswrapper[4687]: I0228 09:17:50.966496 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ltpvl" Feb 28 09:17:51 crc kubenswrapper[4687]: I0228 09:17:51.020734 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-v9vbd" Feb 28 09:17:51 crc kubenswrapper[4687]: I0228 09:17:51.052903 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-9nm28" Feb 28 09:17:51 crc kubenswrapper[4687]: I0228 09:17:51.064212 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-jbzlm" Feb 28 09:17:51 crc kubenswrapper[4687]: I0228 09:17:51.090221 4687 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54688575f-hsvs9" Feb 28 09:17:51 crc kubenswrapper[4687]: I0228 09:17:51.272324 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-9fpjj" Feb 28 09:17:51 crc kubenswrapper[4687]: I0228 09:17:51.307432 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-q5zdg" Feb 28 09:17:51 crc kubenswrapper[4687]: I0228 09:17:51.331802 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-fxqv8" Feb 28 09:17:51 crc kubenswrapper[4687]: I0228 09:17:51.371748 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-2t7hs" Feb 28 09:17:51 crc kubenswrapper[4687]: I0228 09:17:51.482897 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-chfpl" Feb 28 09:17:51 crc kubenswrapper[4687]: I0228 09:17:51.517110 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-c92d5" Feb 28 09:17:57 crc kubenswrapper[4687]: I0228 09:17:57.708094 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jht6f" event={"ID":"7f019778-ba45-4e4a-a6d8-dd6d056aed3b","Type":"ContainerStarted","Data":"9aea6137b1ab3aff5eb7f749b2d2504115a71e32cb2f8485a64f319e1d6e8684"} Feb 28 09:17:57 crc kubenswrapper[4687]: I0228 09:17:57.708641 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jht6f" Feb 28 09:17:57 crc 
kubenswrapper[4687]: I0228 09:17:57.709627 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-dsfvj" event={"ID":"134bd541-e4b0-4e84-b85d-a50c413d6cd2","Type":"ContainerStarted","Data":"e1e861a0d81c1f68fc947d264b60f14a6da6c737f85988b33a261f4a0d97703e"} Feb 28 09:17:57 crc kubenswrapper[4687]: I0228 09:17:57.709782 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-dsfvj" Feb 28 09:17:57 crc kubenswrapper[4687]: I0228 09:17:57.711859 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p64nn" event={"ID":"da7dfebc-ad65-4d02-a7f8-c10f9a6ac0d4","Type":"ContainerStarted","Data":"eea045ade64983ee6df6d52a31088bd09762a60269339b2a8c747b74a18b2abe"} Feb 28 09:17:57 crc kubenswrapper[4687]: I0228 09:17:57.713524 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-kdxq5" event={"ID":"72be3389-d521-4742-9081-8bdc3aef0dc6","Type":"ContainerStarted","Data":"46f7a774da4499461e4dd0c1bd885231c3c909cb1f4daa0c968a28c64d69725e"} Feb 28 09:17:57 crc kubenswrapper[4687]: I0228 09:17:57.713684 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-kdxq5" Feb 28 09:17:57 crc kubenswrapper[4687]: I0228 09:17:57.715284 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-8r8kv" event={"ID":"14725449-2193-4b84-b736-31c04f9f43e4","Type":"ContainerStarted","Data":"aff5f001d2318450574b571932fb7dbbb1d35d099f597af5886265cf3e277c0e"} Feb 28 09:17:57 crc kubenswrapper[4687]: I0228 09:17:57.715531 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-8r8kv" Feb 28 09:17:57 crc kubenswrapper[4687]: I0228 09:17:57.716802 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-jw6hs" event={"ID":"89b24774-f0eb-4d63-a124-1b244f195163","Type":"ContainerStarted","Data":"d8a4512bf587e582ae30cb3a797551dd58c0a939f0195de16ea6c36ffd173d58"} Feb 28 09:17:57 crc kubenswrapper[4687]: I0228 09:17:57.717064 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-jw6hs" Feb 28 09:17:57 crc kubenswrapper[4687]: I0228 09:17:57.725402 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jht6f" podStartSLOduration=3.270890867 podStartE2EDuration="27.725391612s" podCreationTimestamp="2026-02-28 09:17:30 +0000 UTC" firstStartedPulling="2026-02-28 09:17:32.359321278 +0000 UTC m=+844.049890615" lastFinishedPulling="2026-02-28 09:17:56.813822023 +0000 UTC m=+868.504391360" observedRunningTime="2026-02-28 09:17:57.724258732 +0000 UTC m=+869.414828098" watchObservedRunningTime="2026-02-28 09:17:57.725391612 +0000 UTC m=+869.415960950" Feb 28 09:17:57 crc kubenswrapper[4687]: I0228 09:17:57.742503 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p64nn" podStartSLOduration=2.263876585 podStartE2EDuration="26.742489214s" podCreationTimestamp="2026-02-28 09:17:31 +0000 UTC" firstStartedPulling="2026-02-28 09:17:32.361950354 +0000 UTC m=+844.052519691" lastFinishedPulling="2026-02-28 09:17:56.840562984 +0000 UTC m=+868.531132320" observedRunningTime="2026-02-28 09:17:57.735258834 +0000 UTC m=+869.425828171" watchObservedRunningTime="2026-02-28 09:17:57.742489214 +0000 UTC m=+869.433058551" Feb 28 09:17:57 crc kubenswrapper[4687]: I0228 
09:17:57.750642 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-dsfvj" podStartSLOduration=3.262471632 podStartE2EDuration="27.750627613s" podCreationTimestamp="2026-02-28 09:17:30 +0000 UTC" firstStartedPulling="2026-02-28 09:17:32.35212929 +0000 UTC m=+844.042698627" lastFinishedPulling="2026-02-28 09:17:56.840285271 +0000 UTC m=+868.530854608" observedRunningTime="2026-02-28 09:17:57.74752216 +0000 UTC m=+869.438091488" watchObservedRunningTime="2026-02-28 09:17:57.750627613 +0000 UTC m=+869.441196949" Feb 28 09:17:57 crc kubenswrapper[4687]: I0228 09:17:57.787839 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-kdxq5" podStartSLOduration=3.338990756 podStartE2EDuration="27.787825473s" podCreationTimestamp="2026-02-28 09:17:30 +0000 UTC" firstStartedPulling="2026-02-28 09:17:32.355271921 +0000 UTC m=+844.045841258" lastFinishedPulling="2026-02-28 09:17:56.804106637 +0000 UTC m=+868.494675975" observedRunningTime="2026-02-28 09:17:57.76587802 +0000 UTC m=+869.456447356" watchObservedRunningTime="2026-02-28 09:17:57.787825473 +0000 UTC m=+869.478394810" Feb 28 09:17:57 crc kubenswrapper[4687]: I0228 09:17:57.788510 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-jw6hs" podStartSLOduration=3.326336935 podStartE2EDuration="27.788503638s" podCreationTimestamp="2026-02-28 09:17:30 +0000 UTC" firstStartedPulling="2026-02-28 09:17:32.350457365 +0000 UTC m=+844.041026702" lastFinishedPulling="2026-02-28 09:17:56.812624069 +0000 UTC m=+868.503193405" observedRunningTime="2026-02-28 09:17:57.78451696 +0000 UTC m=+869.475086297" watchObservedRunningTime="2026-02-28 09:17:57.788503638 +0000 UTC m=+869.479072976" Feb 28 09:17:57 crc kubenswrapper[4687]: I0228 09:17:57.799119 4687 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-8r8kv" podStartSLOduration=3.373183034 podStartE2EDuration="27.799106392s" podCreationTimestamp="2026-02-28 09:17:30 +0000 UTC" firstStartedPulling="2026-02-28 09:17:32.357735185 +0000 UTC m=+844.048304522" lastFinishedPulling="2026-02-28 09:17:56.783658544 +0000 UTC m=+868.474227880" observedRunningTime="2026-02-28 09:17:57.795969302 +0000 UTC m=+869.486538639" watchObservedRunningTime="2026-02-28 09:17:57.799106392 +0000 UTC m=+869.489675720" Feb 28 09:18:00 crc kubenswrapper[4687]: I0228 09:18:00.140817 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537838-tpskf"] Feb 28 09:18:00 crc kubenswrapper[4687]: I0228 09:18:00.141936 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537838-tpskf" Feb 28 09:18:00 crc kubenswrapper[4687]: I0228 09:18:00.143904 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:18:00 crc kubenswrapper[4687]: I0228 09:18:00.144107 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:18:00 crc kubenswrapper[4687]: I0228 09:18:00.144454 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:18:00 crc kubenswrapper[4687]: I0228 09:18:00.151821 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537838-tpskf"] Feb 28 09:18:00 crc kubenswrapper[4687]: I0228 09:18:00.234127 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6l2g\" (UniqueName: \"kubernetes.io/projected/431f48d9-5c93-4c3e-b2e4-bdb74b8945e3-kube-api-access-x6l2g\") pod \"auto-csr-approver-29537838-tpskf\" (UID: 
\"431f48d9-5c93-4c3e-b2e4-bdb74b8945e3\") " pod="openshift-infra/auto-csr-approver-29537838-tpskf" Feb 28 09:18:00 crc kubenswrapper[4687]: I0228 09:18:00.335873 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6l2g\" (UniqueName: \"kubernetes.io/projected/431f48d9-5c93-4c3e-b2e4-bdb74b8945e3-kube-api-access-x6l2g\") pod \"auto-csr-approver-29537838-tpskf\" (UID: \"431f48d9-5c93-4c3e-b2e4-bdb74b8945e3\") " pod="openshift-infra/auto-csr-approver-29537838-tpskf" Feb 28 09:18:00 crc kubenswrapper[4687]: I0228 09:18:00.354154 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6l2g\" (UniqueName: \"kubernetes.io/projected/431f48d9-5c93-4c3e-b2e4-bdb74b8945e3-kube-api-access-x6l2g\") pod \"auto-csr-approver-29537838-tpskf\" (UID: \"431f48d9-5c93-4c3e-b2e4-bdb74b8945e3\") " pod="openshift-infra/auto-csr-approver-29537838-tpskf" Feb 28 09:18:00 crc kubenswrapper[4687]: I0228 09:18:00.455637 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537838-tpskf" Feb 28 09:18:00 crc kubenswrapper[4687]: I0228 09:18:00.846881 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537838-tpskf"] Feb 28 09:18:00 crc kubenswrapper[4687]: W0228 09:18:00.853009 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod431f48d9_5c93_4c3e_b2e4_bdb74b8945e3.slice/crio-bd5cecd800208534e64feee193fb2fc507f4c41fb82d22d3188dcec8266ef226 WatchSource:0}: Error finding container bd5cecd800208534e64feee193fb2fc507f4c41fb82d22d3188dcec8266ef226: Status 404 returned error can't find the container with id bd5cecd800208534e64feee193fb2fc507f4c41fb82d22d3188dcec8266ef226 Feb 28 09:18:01 crc kubenswrapper[4687]: I0228 09:18:01.743688 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537838-tpskf" event={"ID":"431f48d9-5c93-4c3e-b2e4-bdb74b8945e3","Type":"ContainerStarted","Data":"bd5cecd800208534e64feee193fb2fc507f4c41fb82d22d3188dcec8266ef226"} Feb 28 09:18:02 crc kubenswrapper[4687]: I0228 09:18:02.672361 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/caa33de5-0fe2-4930-bf89-0f8ad6a96ca2-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vqdm7\" (UID: \"caa33de5-0fe2-4930-bf89-0f8ad6a96ca2\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vqdm7" Feb 28 09:18:02 crc kubenswrapper[4687]: I0228 09:18:02.678038 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/caa33de5-0fe2-4930-bf89-0f8ad6a96ca2-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vqdm7\" (UID: \"caa33de5-0fe2-4930-bf89-0f8ad6a96ca2\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vqdm7" Feb 28 09:18:02 crc kubenswrapper[4687]: I0228 
09:18:02.751379 4687 generic.go:334] "Generic (PLEG): container finished" podID="431f48d9-5c93-4c3e-b2e4-bdb74b8945e3" containerID="43d44663884a598f4ca038e01e490f16d11416c524dc79aa1973d329172ede08" exitCode=0 Feb 28 09:18:02 crc kubenswrapper[4687]: I0228 09:18:02.751495 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537838-tpskf" event={"ID":"431f48d9-5c93-4c3e-b2e4-bdb74b8945e3","Type":"ContainerDied","Data":"43d44663884a598f4ca038e01e490f16d11416c524dc79aa1973d329172ede08"} Feb 28 09:18:02 crc kubenswrapper[4687]: I0228 09:18:02.774090 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe-cert\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776925xf7\" (UID: \"e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776925xf7" Feb 28 09:18:02 crc kubenswrapper[4687]: I0228 09:18:02.779592 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe-cert\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776925xf7\" (UID: \"e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776925xf7" Feb 28 09:18:02 crc kubenswrapper[4687]: I0228 09:18:02.824091 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vqdm7" Feb 28 09:18:03 crc kubenswrapper[4687]: I0228 09:18:03.047434 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776925xf7" Feb 28 09:18:03 crc kubenswrapper[4687]: I0228 09:18:03.182188 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-metrics-certs\") pod \"openstack-operator-controller-manager-864b865b94-72kg5\" (UID: \"005ef854-8015-4724-b7b1-42f8fe9a1497\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" Feb 28 09:18:03 crc kubenswrapper[4687]: I0228 09:18:03.182510 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-webhook-certs\") pod \"openstack-operator-controller-manager-864b865b94-72kg5\" (UID: \"005ef854-8015-4724-b7b1-42f8fe9a1497\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" Feb 28 09:18:03 crc kubenswrapper[4687]: I0228 09:18:03.186881 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-webhook-certs\") pod \"openstack-operator-controller-manager-864b865b94-72kg5\" (UID: \"005ef854-8015-4724-b7b1-42f8fe9a1497\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" Feb 28 09:18:03 crc kubenswrapper[4687]: I0228 09:18:03.186913 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/005ef854-8015-4724-b7b1-42f8fe9a1497-metrics-certs\") pod \"openstack-operator-controller-manager-864b865b94-72kg5\" (UID: \"005ef854-8015-4724-b7b1-42f8fe9a1497\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" Feb 28 09:18:03 crc kubenswrapper[4687]: I0228 09:18:03.204748 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-vqdm7"] Feb 28 09:18:03 crc kubenswrapper[4687]: W0228 09:18:03.210369 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcaa33de5_0fe2_4930_bf89_0f8ad6a96ca2.slice/crio-6afd28254ea5e7061fe5ac4dd27f07c11f8ec602e966bca21ec28518eba385fd WatchSource:0}: Error finding container 6afd28254ea5e7061fe5ac4dd27f07c11f8ec602e966bca21ec28518eba385fd: Status 404 returned error can't find the container with id 6afd28254ea5e7061fe5ac4dd27f07c11f8ec602e966bca21ec28518eba385fd Feb 28 09:18:03 crc kubenswrapper[4687]: I0228 09:18:03.386282 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" Feb 28 09:18:03 crc kubenswrapper[4687]: I0228 09:18:03.440198 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776925xf7"] Feb 28 09:18:03 crc kubenswrapper[4687]: W0228 09:18:03.443586 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1f23b9a_0cdb_4cc2_865d_49e56d8fdebe.slice/crio-7b65b386f772d8767860a5925b4b0afa59cd618f98e0ac4ef172c105a5b5dff1 WatchSource:0}: Error finding container 7b65b386f772d8767860a5925b4b0afa59cd618f98e0ac4ef172c105a5b5dff1: Status 404 returned error can't find the container with id 7b65b386f772d8767860a5925b4b0afa59cd618f98e0ac4ef172c105a5b5dff1 Feb 28 09:18:03 crc kubenswrapper[4687]: I0228 09:18:03.766683 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776925xf7" event={"ID":"e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe","Type":"ContainerStarted","Data":"7b65b386f772d8767860a5925b4b0afa59cd618f98e0ac4ef172c105a5b5dff1"} Feb 28 09:18:03 crc kubenswrapper[4687]: I0228 
09:18:03.769363 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vqdm7" event={"ID":"caa33de5-0fe2-4930-bf89-0f8ad6a96ca2","Type":"ContainerStarted","Data":"6afd28254ea5e7061fe5ac4dd27f07c11f8ec602e966bca21ec28518eba385fd"} Feb 28 09:18:03 crc kubenswrapper[4687]: I0228 09:18:03.789663 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5"] Feb 28 09:18:03 crc kubenswrapper[4687]: W0228 09:18:03.793229 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod005ef854_8015_4724_b7b1_42f8fe9a1497.slice/crio-9b587b7bb316217376377938bc1026f9d97f32378907f50f1758cc4a8ac205ee WatchSource:0}: Error finding container 9b587b7bb316217376377938bc1026f9d97f32378907f50f1758cc4a8ac205ee: Status 404 returned error can't find the container with id 9b587b7bb316217376377938bc1026f9d97f32378907f50f1758cc4a8ac205ee Feb 28 09:18:03 crc kubenswrapper[4687]: I0228 09:18:03.994532 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537838-tpskf" Feb 28 09:18:04 crc kubenswrapper[4687]: I0228 09:18:04.096976 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6l2g\" (UniqueName: \"kubernetes.io/projected/431f48d9-5c93-4c3e-b2e4-bdb74b8945e3-kube-api-access-x6l2g\") pod \"431f48d9-5c93-4c3e-b2e4-bdb74b8945e3\" (UID: \"431f48d9-5c93-4c3e-b2e4-bdb74b8945e3\") " Feb 28 09:18:04 crc kubenswrapper[4687]: I0228 09:18:04.102268 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/431f48d9-5c93-4c3e-b2e4-bdb74b8945e3-kube-api-access-x6l2g" (OuterVolumeSpecName: "kube-api-access-x6l2g") pod "431f48d9-5c93-4c3e-b2e4-bdb74b8945e3" (UID: "431f48d9-5c93-4c3e-b2e4-bdb74b8945e3"). 
InnerVolumeSpecName "kube-api-access-x6l2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:18:04 crc kubenswrapper[4687]: I0228 09:18:04.199030 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6l2g\" (UniqueName: \"kubernetes.io/projected/431f48d9-5c93-4c3e-b2e4-bdb74b8945e3-kube-api-access-x6l2g\") on node \"crc\" DevicePath \"\"" Feb 28 09:18:04 crc kubenswrapper[4687]: I0228 09:18:04.779250 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537838-tpskf" Feb 28 09:18:04 crc kubenswrapper[4687]: I0228 09:18:04.779239 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537838-tpskf" event={"ID":"431f48d9-5c93-4c3e-b2e4-bdb74b8945e3","Type":"ContainerDied","Data":"bd5cecd800208534e64feee193fb2fc507f4c41fb82d22d3188dcec8266ef226"} Feb 28 09:18:04 crc kubenswrapper[4687]: I0228 09:18:04.779321 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd5cecd800208534e64feee193fb2fc507f4c41fb82d22d3188dcec8266ef226" Feb 28 09:18:04 crc kubenswrapper[4687]: I0228 09:18:04.785134 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" event={"ID":"005ef854-8015-4724-b7b1-42f8fe9a1497","Type":"ContainerStarted","Data":"06a45cf4e7e3f593f6918aee0ad7ffdf74fbb41c2638713eecfdcb687446c8ba"} Feb 28 09:18:04 crc kubenswrapper[4687]: I0228 09:18:04.785174 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" event={"ID":"005ef854-8015-4724-b7b1-42f8fe9a1497","Type":"ContainerStarted","Data":"9b587b7bb316217376377938bc1026f9d97f32378907f50f1758cc4a8ac205ee"} Feb 28 09:18:04 crc kubenswrapper[4687]: I0228 09:18:04.785341 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" Feb 28 09:18:04 crc kubenswrapper[4687]: I0228 09:18:04.814368 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" podStartSLOduration=33.814347240000004 podStartE2EDuration="33.81434724s" podCreationTimestamp="2026-02-28 09:17:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:18:04.808441832 +0000 UTC m=+876.499011169" watchObservedRunningTime="2026-02-28 09:18:04.81434724 +0000 UTC m=+876.504916577" Feb 28 09:18:05 crc kubenswrapper[4687]: I0228 09:18:05.039846 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537832-gnr45"] Feb 28 09:18:05 crc kubenswrapper[4687]: I0228 09:18:05.043898 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537832-gnr45"] Feb 28 09:18:05 crc kubenswrapper[4687]: I0228 09:18:05.799902 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vqdm7" event={"ID":"caa33de5-0fe2-4930-bf89-0f8ad6a96ca2","Type":"ContainerStarted","Data":"41abd526c881936d84d750da041b962a8ad2d05385c8f57300ec1f4be9eab98d"} Feb 28 09:18:05 crc kubenswrapper[4687]: I0228 09:18:05.800371 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vqdm7" Feb 28 09:18:05 crc kubenswrapper[4687]: I0228 09:18:05.808561 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776925xf7" event={"ID":"e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe","Type":"ContainerStarted","Data":"6b3018259b8db66cf68ef8ddc66445c3124ee97097dcb61b585fa65a2840af7a"} Feb 28 09:18:05 crc kubenswrapper[4687]: I0228 
09:18:05.828679 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vqdm7" podStartSLOduration=33.430998764 podStartE2EDuration="35.828662799s" podCreationTimestamp="2026-02-28 09:17:30 +0000 UTC" firstStartedPulling="2026-02-28 09:18:03.213171002 +0000 UTC m=+874.903740338" lastFinishedPulling="2026-02-28 09:18:05.610835036 +0000 UTC m=+877.301404373" observedRunningTime="2026-02-28 09:18:05.82504282 +0000 UTC m=+877.515612147" watchObservedRunningTime="2026-02-28 09:18:05.828662799 +0000 UTC m=+877.519232137" Feb 28 09:18:06 crc kubenswrapper[4687]: I0228 09:18:06.664646 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd97b565-7b98-4a48-8ff8-ed015b66da45" path="/var/lib/kubelet/pods/fd97b565-7b98-4a48-8ff8-ed015b66da45/volumes" Feb 28 09:18:06 crc kubenswrapper[4687]: I0228 09:18:06.815455 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776925xf7" Feb 28 09:18:11 crc kubenswrapper[4687]: I0228 09:18:11.051127 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-8r8kv" Feb 28 09:18:11 crc kubenswrapper[4687]: I0228 09:18:11.066794 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776925xf7" podStartSLOduration=38.899451048 podStartE2EDuration="41.066759625s" podCreationTimestamp="2026-02-28 09:17:30 +0000 UTC" firstStartedPulling="2026-02-28 09:18:03.447582299 +0000 UTC m=+875.138151636" lastFinishedPulling="2026-02-28 09:18:05.614890876 +0000 UTC m=+877.305460213" observedRunningTime="2026-02-28 09:18:05.861398016 +0000 UTC m=+877.551967353" watchObservedRunningTime="2026-02-28 09:18:11.066759625 +0000 UTC m=+882.757328962" Feb 28 09:18:11 crc 
kubenswrapper[4687]: I0228 09:18:11.068419 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-jw6hs" Feb 28 09:18:11 crc kubenswrapper[4687]: I0228 09:18:11.111535 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-dsfvj" Feb 28 09:18:11 crc kubenswrapper[4687]: I0228 09:18:11.120237 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-kdxq5" Feb 28 09:18:11 crc kubenswrapper[4687]: I0228 09:18:11.297101 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jht6f" Feb 28 09:18:12 crc kubenswrapper[4687]: I0228 09:18:12.829622 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vqdm7" Feb 28 09:18:13 crc kubenswrapper[4687]: I0228 09:18:13.054393 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776925xf7" Feb 28 09:18:13 crc kubenswrapper[4687]: I0228 09:18:13.392496 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-864b865b94-72kg5" Feb 28 09:18:25 crc kubenswrapper[4687]: I0228 09:18:25.751226 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-9zft7"] Feb 28 09:18:25 crc kubenswrapper[4687]: E0228 09:18:25.751956 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="431f48d9-5c93-4c3e-b2e4-bdb74b8945e3" containerName="oc" Feb 28 09:18:25 crc kubenswrapper[4687]: I0228 09:18:25.751976 4687 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="431f48d9-5c93-4c3e-b2e4-bdb74b8945e3" containerName="oc" Feb 28 09:18:25 crc kubenswrapper[4687]: I0228 09:18:25.752140 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="431f48d9-5c93-4c3e-b2e4-bdb74b8945e3" containerName="oc" Feb 28 09:18:25 crc kubenswrapper[4687]: I0228 09:18:25.752773 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-9zft7" Feb 28 09:18:25 crc kubenswrapper[4687]: I0228 09:18:25.754809 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-6vh59" Feb 28 09:18:25 crc kubenswrapper[4687]: I0228 09:18:25.755810 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 28 09:18:25 crc kubenswrapper[4687]: I0228 09:18:25.757189 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 28 09:18:25 crc kubenswrapper[4687]: I0228 09:18:25.757710 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 28 09:18:25 crc kubenswrapper[4687]: I0228 09:18:25.768787 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-9zft7"] Feb 28 09:18:25 crc kubenswrapper[4687]: I0228 09:18:25.840337 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrqz4\" (UniqueName: \"kubernetes.io/projected/d0151919-e58a-406d-939d-d88c8103e6f8-kube-api-access-qrqz4\") pod \"dnsmasq-dns-589db6c89c-9zft7\" (UID: \"d0151919-e58a-406d-939d-d88c8103e6f8\") " pod="openstack/dnsmasq-dns-589db6c89c-9zft7" Feb 28 09:18:25 crc kubenswrapper[4687]: I0228 09:18:25.840386 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0151919-e58a-406d-939d-d88c8103e6f8-config\") pod \"dnsmasq-dns-589db6c89c-9zft7\" (UID: 
\"d0151919-e58a-406d-939d-d88c8103e6f8\") " pod="openstack/dnsmasq-dns-589db6c89c-9zft7" Feb 28 09:18:25 crc kubenswrapper[4687]: I0228 09:18:25.899606 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-l2vmx"] Feb 28 09:18:25 crc kubenswrapper[4687]: I0228 09:18:25.900589 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-l2vmx" Feb 28 09:18:25 crc kubenswrapper[4687]: I0228 09:18:25.902637 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 28 09:18:25 crc kubenswrapper[4687]: I0228 09:18:25.913430 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-l2vmx"] Feb 28 09:18:25 crc kubenswrapper[4687]: I0228 09:18:25.941123 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrqz4\" (UniqueName: \"kubernetes.io/projected/d0151919-e58a-406d-939d-d88c8103e6f8-kube-api-access-qrqz4\") pod \"dnsmasq-dns-589db6c89c-9zft7\" (UID: \"d0151919-e58a-406d-939d-d88c8103e6f8\") " pod="openstack/dnsmasq-dns-589db6c89c-9zft7" Feb 28 09:18:25 crc kubenswrapper[4687]: I0228 09:18:25.941168 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0151919-e58a-406d-939d-d88c8103e6f8-config\") pod \"dnsmasq-dns-589db6c89c-9zft7\" (UID: \"d0151919-e58a-406d-939d-d88c8103e6f8\") " pod="openstack/dnsmasq-dns-589db6c89c-9zft7" Feb 28 09:18:25 crc kubenswrapper[4687]: I0228 09:18:25.941979 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0151919-e58a-406d-939d-d88c8103e6f8-config\") pod \"dnsmasq-dns-589db6c89c-9zft7\" (UID: \"d0151919-e58a-406d-939d-d88c8103e6f8\") " pod="openstack/dnsmasq-dns-589db6c89c-9zft7" Feb 28 09:18:25 crc kubenswrapper[4687]: I0228 09:18:25.965792 4687 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qrqz4\" (UniqueName: \"kubernetes.io/projected/d0151919-e58a-406d-939d-d88c8103e6f8-kube-api-access-qrqz4\") pod \"dnsmasq-dns-589db6c89c-9zft7\" (UID: \"d0151919-e58a-406d-939d-d88c8103e6f8\") " pod="openstack/dnsmasq-dns-589db6c89c-9zft7" Feb 28 09:18:26 crc kubenswrapper[4687]: I0228 09:18:26.042795 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5cd53cf-b205-4c6c-92be-155def921e74-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-l2vmx\" (UID: \"e5cd53cf-b205-4c6c-92be-155def921e74\") " pod="openstack/dnsmasq-dns-86bbd886cf-l2vmx" Feb 28 09:18:26 crc kubenswrapper[4687]: I0228 09:18:26.042913 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5cd53cf-b205-4c6c-92be-155def921e74-config\") pod \"dnsmasq-dns-86bbd886cf-l2vmx\" (UID: \"e5cd53cf-b205-4c6c-92be-155def921e74\") " pod="openstack/dnsmasq-dns-86bbd886cf-l2vmx" Feb 28 09:18:26 crc kubenswrapper[4687]: I0228 09:18:26.042977 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlwb4\" (UniqueName: \"kubernetes.io/projected/e5cd53cf-b205-4c6c-92be-155def921e74-kube-api-access-jlwb4\") pod \"dnsmasq-dns-86bbd886cf-l2vmx\" (UID: \"e5cd53cf-b205-4c6c-92be-155def921e74\") " pod="openstack/dnsmasq-dns-86bbd886cf-l2vmx" Feb 28 09:18:26 crc kubenswrapper[4687]: I0228 09:18:26.067382 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-9zft7" Feb 28 09:18:26 crc kubenswrapper[4687]: I0228 09:18:26.143618 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5cd53cf-b205-4c6c-92be-155def921e74-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-l2vmx\" (UID: \"e5cd53cf-b205-4c6c-92be-155def921e74\") " pod="openstack/dnsmasq-dns-86bbd886cf-l2vmx" Feb 28 09:18:26 crc kubenswrapper[4687]: I0228 09:18:26.143836 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5cd53cf-b205-4c6c-92be-155def921e74-config\") pod \"dnsmasq-dns-86bbd886cf-l2vmx\" (UID: \"e5cd53cf-b205-4c6c-92be-155def921e74\") " pod="openstack/dnsmasq-dns-86bbd886cf-l2vmx" Feb 28 09:18:26 crc kubenswrapper[4687]: I0228 09:18:26.143883 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlwb4\" (UniqueName: \"kubernetes.io/projected/e5cd53cf-b205-4c6c-92be-155def921e74-kube-api-access-jlwb4\") pod \"dnsmasq-dns-86bbd886cf-l2vmx\" (UID: \"e5cd53cf-b205-4c6c-92be-155def921e74\") " pod="openstack/dnsmasq-dns-86bbd886cf-l2vmx" Feb 28 09:18:26 crc kubenswrapper[4687]: I0228 09:18:26.145108 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5cd53cf-b205-4c6c-92be-155def921e74-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-l2vmx\" (UID: \"e5cd53cf-b205-4c6c-92be-155def921e74\") " pod="openstack/dnsmasq-dns-86bbd886cf-l2vmx" Feb 28 09:18:26 crc kubenswrapper[4687]: I0228 09:18:26.145190 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5cd53cf-b205-4c6c-92be-155def921e74-config\") pod \"dnsmasq-dns-86bbd886cf-l2vmx\" (UID: \"e5cd53cf-b205-4c6c-92be-155def921e74\") " pod="openstack/dnsmasq-dns-86bbd886cf-l2vmx" Feb 28 09:18:26 crc kubenswrapper[4687]: I0228 
09:18:26.158962 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlwb4\" (UniqueName: \"kubernetes.io/projected/e5cd53cf-b205-4c6c-92be-155def921e74-kube-api-access-jlwb4\") pod \"dnsmasq-dns-86bbd886cf-l2vmx\" (UID: \"e5cd53cf-b205-4c6c-92be-155def921e74\") " pod="openstack/dnsmasq-dns-86bbd886cf-l2vmx" Feb 28 09:18:26 crc kubenswrapper[4687]: I0228 09:18:26.212431 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-l2vmx" Feb 28 09:18:26 crc kubenswrapper[4687]: I0228 09:18:26.440497 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-9zft7"] Feb 28 09:18:26 crc kubenswrapper[4687]: W0228 09:18:26.443159 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0151919_e58a_406d_939d_d88c8103e6f8.slice/crio-013ba73f46b43155d8989e1496589efacfad762a3193ca35051d097c6558f997 WatchSource:0}: Error finding container 013ba73f46b43155d8989e1496589efacfad762a3193ca35051d097c6558f997: Status 404 returned error can't find the container with id 013ba73f46b43155d8989e1496589efacfad762a3193ca35051d097c6558f997 Feb 28 09:18:26 crc kubenswrapper[4687]: W0228 09:18:26.586288 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5cd53cf_b205_4c6c_92be_155def921e74.slice/crio-3461489647dfeabdee2c2b6cc6f95bbf08c37ac56e129c63fa811ccfb12f10a6 WatchSource:0}: Error finding container 3461489647dfeabdee2c2b6cc6f95bbf08c37ac56e129c63fa811ccfb12f10a6: Status 404 returned error can't find the container with id 3461489647dfeabdee2c2b6cc6f95bbf08c37ac56e129c63fa811ccfb12f10a6 Feb 28 09:18:26 crc kubenswrapper[4687]: I0228 09:18:26.586640 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-l2vmx"] Feb 28 09:18:26 crc kubenswrapper[4687]: I0228 
09:18:26.954385 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-l2vmx" event={"ID":"e5cd53cf-b205-4c6c-92be-155def921e74","Type":"ContainerStarted","Data":"3461489647dfeabdee2c2b6cc6f95bbf08c37ac56e129c63fa811ccfb12f10a6"} Feb 28 09:18:26 crc kubenswrapper[4687]: I0228 09:18:26.956078 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-9zft7" event={"ID":"d0151919-e58a-406d-939d-d88c8103e6f8","Type":"ContainerStarted","Data":"013ba73f46b43155d8989e1496589efacfad762a3193ca35051d097c6558f997"} Feb 28 09:18:28 crc kubenswrapper[4687]: I0228 09:18:28.720846 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-9zft7"] Feb 28 09:18:28 crc kubenswrapper[4687]: I0228 09:18:28.771617 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-qzps4"] Feb 28 09:18:28 crc kubenswrapper[4687]: I0228 09:18:28.772823 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-qzps4" Feb 28 09:18:28 crc kubenswrapper[4687]: I0228 09:18:28.816514 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-qzps4"] Feb 28 09:18:28 crc kubenswrapper[4687]: I0228 09:18:28.889340 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr95z\" (UniqueName: \"kubernetes.io/projected/f6bffbc2-0283-4286-9d05-2b60186e0740-kube-api-access-zr95z\") pod \"dnsmasq-dns-78cb4465c9-qzps4\" (UID: \"f6bffbc2-0283-4286-9d05-2b60186e0740\") " pod="openstack/dnsmasq-dns-78cb4465c9-qzps4" Feb 28 09:18:28 crc kubenswrapper[4687]: I0228 09:18:28.889416 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6bffbc2-0283-4286-9d05-2b60186e0740-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-qzps4\" (UID: \"f6bffbc2-0283-4286-9d05-2b60186e0740\") " pod="openstack/dnsmasq-dns-78cb4465c9-qzps4" Feb 28 09:18:28 crc kubenswrapper[4687]: I0228 09:18:28.889447 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6bffbc2-0283-4286-9d05-2b60186e0740-config\") pod \"dnsmasq-dns-78cb4465c9-qzps4\" (UID: \"f6bffbc2-0283-4286-9d05-2b60186e0740\") " pod="openstack/dnsmasq-dns-78cb4465c9-qzps4" Feb 28 09:18:28 crc kubenswrapper[4687]: I0228 09:18:28.991942 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6bffbc2-0283-4286-9d05-2b60186e0740-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-qzps4\" (UID: \"f6bffbc2-0283-4286-9d05-2b60186e0740\") " pod="openstack/dnsmasq-dns-78cb4465c9-qzps4" Feb 28 09:18:28 crc kubenswrapper[4687]: I0228 09:18:28.992003 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f6bffbc2-0283-4286-9d05-2b60186e0740-config\") pod \"dnsmasq-dns-78cb4465c9-qzps4\" (UID: \"f6bffbc2-0283-4286-9d05-2b60186e0740\") " pod="openstack/dnsmasq-dns-78cb4465c9-qzps4" Feb 28 09:18:28 crc kubenswrapper[4687]: I0228 09:18:28.992089 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr95z\" (UniqueName: \"kubernetes.io/projected/f6bffbc2-0283-4286-9d05-2b60186e0740-kube-api-access-zr95z\") pod \"dnsmasq-dns-78cb4465c9-qzps4\" (UID: \"f6bffbc2-0283-4286-9d05-2b60186e0740\") " pod="openstack/dnsmasq-dns-78cb4465c9-qzps4" Feb 28 09:18:28 crc kubenswrapper[4687]: I0228 09:18:28.992933 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6bffbc2-0283-4286-9d05-2b60186e0740-config\") pod \"dnsmasq-dns-78cb4465c9-qzps4\" (UID: \"f6bffbc2-0283-4286-9d05-2b60186e0740\") " pod="openstack/dnsmasq-dns-78cb4465c9-qzps4" Feb 28 09:18:28 crc kubenswrapper[4687]: I0228 09:18:28.993115 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6bffbc2-0283-4286-9d05-2b60186e0740-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-qzps4\" (UID: \"f6bffbc2-0283-4286-9d05-2b60186e0740\") " pod="openstack/dnsmasq-dns-78cb4465c9-qzps4" Feb 28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.012550 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr95z\" (UniqueName: \"kubernetes.io/projected/f6bffbc2-0283-4286-9d05-2b60186e0740-kube-api-access-zr95z\") pod \"dnsmasq-dns-78cb4465c9-qzps4\" (UID: \"f6bffbc2-0283-4286-9d05-2b60186e0740\") " pod="openstack/dnsmasq-dns-78cb4465c9-qzps4" Feb 28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.070799 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-l2vmx"] Feb 28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.085992 4687 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-2qbft"] Feb 28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.087383 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-2qbft" Feb 28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.100308 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-qzps4" Feb 28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.101829 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-2qbft"] Feb 28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.195048 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg4j5\" (UniqueName: \"kubernetes.io/projected/f611fd7a-502d-4db5-ad7f-eae15ccd9486-kube-api-access-jg4j5\") pod \"dnsmasq-dns-7c47bcb9f9-2qbft\" (UID: \"f611fd7a-502d-4db5-ad7f-eae15ccd9486\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-2qbft" Feb 28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.195397 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f611fd7a-502d-4db5-ad7f-eae15ccd9486-config\") pod \"dnsmasq-dns-7c47bcb9f9-2qbft\" (UID: \"f611fd7a-502d-4db5-ad7f-eae15ccd9486\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-2qbft" Feb 28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.195433 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f611fd7a-502d-4db5-ad7f-eae15ccd9486-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-2qbft\" (UID: \"f611fd7a-502d-4db5-ad7f-eae15ccd9486\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-2qbft" Feb 28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.297350 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg4j5\" (UniqueName: 
\"kubernetes.io/projected/f611fd7a-502d-4db5-ad7f-eae15ccd9486-kube-api-access-jg4j5\") pod \"dnsmasq-dns-7c47bcb9f9-2qbft\" (UID: \"f611fd7a-502d-4db5-ad7f-eae15ccd9486\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-2qbft" Feb 28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.297431 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f611fd7a-502d-4db5-ad7f-eae15ccd9486-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-2qbft\" (UID: \"f611fd7a-502d-4db5-ad7f-eae15ccd9486\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-2qbft" Feb 28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.297452 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f611fd7a-502d-4db5-ad7f-eae15ccd9486-config\") pod \"dnsmasq-dns-7c47bcb9f9-2qbft\" (UID: \"f611fd7a-502d-4db5-ad7f-eae15ccd9486\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-2qbft" Feb 28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.299063 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f611fd7a-502d-4db5-ad7f-eae15ccd9486-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-2qbft\" (UID: \"f611fd7a-502d-4db5-ad7f-eae15ccd9486\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-2qbft" Feb 28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.299762 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f611fd7a-502d-4db5-ad7f-eae15ccd9486-config\") pod \"dnsmasq-dns-7c47bcb9f9-2qbft\" (UID: \"f611fd7a-502d-4db5-ad7f-eae15ccd9486\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-2qbft" Feb 28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.336349 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg4j5\" (UniqueName: \"kubernetes.io/projected/f611fd7a-502d-4db5-ad7f-eae15ccd9486-kube-api-access-jg4j5\") pod \"dnsmasq-dns-7c47bcb9f9-2qbft\" 
(UID: \"f611fd7a-502d-4db5-ad7f-eae15ccd9486\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-2qbft" Feb 28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.434958 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-2qbft" Feb 28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.625668 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-qzps4"] Feb 28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.952998 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.954253 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.955989 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.956910 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.957144 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.957538 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.958046 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.965758 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.966084 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lv99p" Feb 
28 09:18:29 crc kubenswrapper[4687]: I0228 09:18:29.996042 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.108486 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/171eb8fe-deaf-4936-b51d-de02b4131b8b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.108626 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/171eb8fe-deaf-4936-b51d-de02b4131b8b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.108653 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/171eb8fe-deaf-4936-b51d-de02b4131b8b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.108774 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/171eb8fe-deaf-4936-b51d-de02b4131b8b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.108835 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/171eb8fe-deaf-4936-b51d-de02b4131b8b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.108876 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/171eb8fe-deaf-4936-b51d-de02b4131b8b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.108922 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv7rs\" (UniqueName: \"kubernetes.io/projected/171eb8fe-deaf-4936-b51d-de02b4131b8b-kube-api-access-xv7rs\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.109044 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.109075 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/171eb8fe-deaf-4936-b51d-de02b4131b8b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.109127 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/171eb8fe-deaf-4936-b51d-de02b4131b8b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.109159 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/171eb8fe-deaf-4936-b51d-de02b4131b8b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.212611 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/171eb8fe-deaf-4936-b51d-de02b4131b8b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.212665 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/171eb8fe-deaf-4936-b51d-de02b4131b8b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.212717 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/171eb8fe-deaf-4936-b51d-de02b4131b8b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.212756 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/171eb8fe-deaf-4936-b51d-de02b4131b8b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.212778 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/171eb8fe-deaf-4936-b51d-de02b4131b8b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.212795 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/171eb8fe-deaf-4936-b51d-de02b4131b8b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.212822 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/171eb8fe-deaf-4936-b51d-de02b4131b8b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.212838 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/171eb8fe-deaf-4936-b51d-de02b4131b8b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.212858 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv7rs\" (UniqueName: \"kubernetes.io/projected/171eb8fe-deaf-4936-b51d-de02b4131b8b-kube-api-access-xv7rs\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.212892 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.212914 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/171eb8fe-deaf-4936-b51d-de02b4131b8b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.213424 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/171eb8fe-deaf-4936-b51d-de02b4131b8b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.214351 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/171eb8fe-deaf-4936-b51d-de02b4131b8b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.214862 4687 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Feb 28 
09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.214876 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/171eb8fe-deaf-4936-b51d-de02b4131b8b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.214912 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/171eb8fe-deaf-4936-b51d-de02b4131b8b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.215167 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/171eb8fe-deaf-4936-b51d-de02b4131b8b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.221569 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.227852 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/171eb8fe-deaf-4936-b51d-de02b4131b8b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.232045 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.234485 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.237396 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.237639 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.237725 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.237951 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.238106 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.238951 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-khs6z" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.239154 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.242191 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/171eb8fe-deaf-4936-b51d-de02b4131b8b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.242244 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/171eb8fe-deaf-4936-b51d-de02b4131b8b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.245763 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/171eb8fe-deaf-4936-b51d-de02b4131b8b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.245811 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv7rs\" (UniqueName: \"kubernetes.io/projected/171eb8fe-deaf-4936-b51d-de02b4131b8b-kube-api-access-xv7rs\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.246282 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.296195 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.314897 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/541f5799-4b5e-4767-aca7-8c3738502a06-server-conf\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.314966 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/541f5799-4b5e-4767-aca7-8c3738502a06-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.314997 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/541f5799-4b5e-4767-aca7-8c3738502a06-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.315015 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/541f5799-4b5e-4767-aca7-8c3738502a06-config-data\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.315058 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc 
kubenswrapper[4687]: I0228 09:18:30.315089 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/541f5799-4b5e-4767-aca7-8c3738502a06-pod-info\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.315113 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/541f5799-4b5e-4767-aca7-8c3738502a06-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.315170 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/541f5799-4b5e-4767-aca7-8c3738502a06-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.315197 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/541f5799-4b5e-4767-aca7-8c3738502a06-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.315229 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/541f5799-4b5e-4767-aca7-8c3738502a06-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.315249 4687 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp667\" (UniqueName: \"kubernetes.io/projected/541f5799-4b5e-4767-aca7-8c3738502a06-kube-api-access-jp667\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.419058 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/541f5799-4b5e-4767-aca7-8c3738502a06-server-conf\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.419134 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/541f5799-4b5e-4767-aca7-8c3738502a06-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.419159 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/541f5799-4b5e-4767-aca7-8c3738502a06-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.419204 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/541f5799-4b5e-4767-aca7-8c3738502a06-config-data\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.419231 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.419256 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/541f5799-4b5e-4767-aca7-8c3738502a06-pod-info\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.419279 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/541f5799-4b5e-4767-aca7-8c3738502a06-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.419325 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/541f5799-4b5e-4767-aca7-8c3738502a06-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.419356 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/541f5799-4b5e-4767-aca7-8c3738502a06-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.419387 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/541f5799-4b5e-4767-aca7-8c3738502a06-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " 
pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.419404 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp667\" (UniqueName: \"kubernetes.io/projected/541f5799-4b5e-4767-aca7-8c3738502a06-kube-api-access-jp667\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.421792 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/541f5799-4b5e-4767-aca7-8c3738502a06-server-conf\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.422183 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/541f5799-4b5e-4767-aca7-8c3738502a06-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.423109 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/541f5799-4b5e-4767-aca7-8c3738502a06-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.425257 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/541f5799-4b5e-4767-aca7-8c3738502a06-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.427587 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/541f5799-4b5e-4767-aca7-8c3738502a06-config-data\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.427736 4687 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.430500 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/541f5799-4b5e-4767-aca7-8c3738502a06-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.430937 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/541f5799-4b5e-4767-aca7-8c3738502a06-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.431355 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/541f5799-4b5e-4767-aca7-8c3738502a06-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.436631 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp667\" (UniqueName: \"kubernetes.io/projected/541f5799-4b5e-4767-aca7-8c3738502a06-kube-api-access-jp667\") pod \"rabbitmq-server-0\" (UID: 
\"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.440077 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/541f5799-4b5e-4767-aca7-8c3738502a06-pod-info\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.450686 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " pod="openstack/rabbitmq-server-0" Feb 28 09:18:30 crc kubenswrapper[4687]: I0228 09:18:30.587319 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.241291 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.242451 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.245549 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.245895 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-wjw29" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.248346 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.251402 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.254988 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.258482 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.336478 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1fac181-ae33-45e1-8171-1d998d59bc04-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c1fac181-ae33-45e1-8171-1d998d59bc04\") " pod="openstack/openstack-galera-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.336564 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr7cp\" (UniqueName: \"kubernetes.io/projected/c1fac181-ae33-45e1-8171-1d998d59bc04-kube-api-access-dr7cp\") pod \"openstack-galera-0\" (UID: \"c1fac181-ae33-45e1-8171-1d998d59bc04\") " pod="openstack/openstack-galera-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.336602 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1fac181-ae33-45e1-8171-1d998d59bc04-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c1fac181-ae33-45e1-8171-1d998d59bc04\") " pod="openstack/openstack-galera-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.336758 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"c1fac181-ae33-45e1-8171-1d998d59bc04\") " pod="openstack/openstack-galera-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.336815 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1fac181-ae33-45e1-8171-1d998d59bc04-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c1fac181-ae33-45e1-8171-1d998d59bc04\") " pod="openstack/openstack-galera-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.336863 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c1fac181-ae33-45e1-8171-1d998d59bc04-config-data-default\") pod \"openstack-galera-0\" (UID: \"c1fac181-ae33-45e1-8171-1d998d59bc04\") " pod="openstack/openstack-galera-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.336945 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c1fac181-ae33-45e1-8171-1d998d59bc04-kolla-config\") pod \"openstack-galera-0\" (UID: \"c1fac181-ae33-45e1-8171-1d998d59bc04\") " pod="openstack/openstack-galera-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.336995 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/c1fac181-ae33-45e1-8171-1d998d59bc04-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c1fac181-ae33-45e1-8171-1d998d59bc04\") " pod="openstack/openstack-galera-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.438029 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c1fac181-ae33-45e1-8171-1d998d59bc04-config-data-default\") pod \"openstack-galera-0\" (UID: \"c1fac181-ae33-45e1-8171-1d998d59bc04\") " pod="openstack/openstack-galera-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.438098 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c1fac181-ae33-45e1-8171-1d998d59bc04-kolla-config\") pod \"openstack-galera-0\" (UID: \"c1fac181-ae33-45e1-8171-1d998d59bc04\") " pod="openstack/openstack-galera-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.438256 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c1fac181-ae33-45e1-8171-1d998d59bc04-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c1fac181-ae33-45e1-8171-1d998d59bc04\") " pod="openstack/openstack-galera-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.438291 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1fac181-ae33-45e1-8171-1d998d59bc04-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c1fac181-ae33-45e1-8171-1d998d59bc04\") " pod="openstack/openstack-galera-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.438346 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr7cp\" (UniqueName: \"kubernetes.io/projected/c1fac181-ae33-45e1-8171-1d998d59bc04-kube-api-access-dr7cp\") pod \"openstack-galera-0\" 
(UID: \"c1fac181-ae33-45e1-8171-1d998d59bc04\") " pod="openstack/openstack-galera-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.438378 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1fac181-ae33-45e1-8171-1d998d59bc04-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c1fac181-ae33-45e1-8171-1d998d59bc04\") " pod="openstack/openstack-galera-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.438450 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"c1fac181-ae33-45e1-8171-1d998d59bc04\") " pod="openstack/openstack-galera-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.438485 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1fac181-ae33-45e1-8171-1d998d59bc04-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c1fac181-ae33-45e1-8171-1d998d59bc04\") " pod="openstack/openstack-galera-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.443178 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1fac181-ae33-45e1-8171-1d998d59bc04-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c1fac181-ae33-45e1-8171-1d998d59bc04\") " pod="openstack/openstack-galera-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.443836 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c1fac181-ae33-45e1-8171-1d998d59bc04-config-data-default\") pod \"openstack-galera-0\" (UID: \"c1fac181-ae33-45e1-8171-1d998d59bc04\") " pod="openstack/openstack-galera-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.444340 4687 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c1fac181-ae33-45e1-8171-1d998d59bc04-kolla-config\") pod \"openstack-galera-0\" (UID: \"c1fac181-ae33-45e1-8171-1d998d59bc04\") " pod="openstack/openstack-galera-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.444616 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c1fac181-ae33-45e1-8171-1d998d59bc04-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c1fac181-ae33-45e1-8171-1d998d59bc04\") " pod="openstack/openstack-galera-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.444634 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1fac181-ae33-45e1-8171-1d998d59bc04-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c1fac181-ae33-45e1-8171-1d998d59bc04\") " pod="openstack/openstack-galera-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.444708 4687 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"c1fac181-ae33-45e1-8171-1d998d59bc04\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.461152 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1fac181-ae33-45e1-8171-1d998d59bc04-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c1fac181-ae33-45e1-8171-1d998d59bc04\") " pod="openstack/openstack-galera-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.476961 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr7cp\" (UniqueName: 
\"kubernetes.io/projected/c1fac181-ae33-45e1-8171-1d998d59bc04-kube-api-access-dr7cp\") pod \"openstack-galera-0\" (UID: \"c1fac181-ae33-45e1-8171-1d998d59bc04\") " pod="openstack/openstack-galera-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.479333 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"c1fac181-ae33-45e1-8171-1d998d59bc04\") " pod="openstack/openstack-galera-0" Feb 28 09:18:31 crc kubenswrapper[4687]: I0228 09:18:31.577954 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.696666 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.700321 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.704913 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.705117 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.705237 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.706108 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-pmgpk" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.711300 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.761821 4687 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1fe0178-db8f-44e3-9e53-a2450914080a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d1fe0178-db8f-44e3-9e53-a2450914080a\") " pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.761865 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1fe0178-db8f-44e3-9e53-a2450914080a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d1fe0178-db8f-44e3-9e53-a2450914080a\") " pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.761923 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d1fe0178-db8f-44e3-9e53-a2450914080a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d1fe0178-db8f-44e3-9e53-a2450914080a\") " pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.761950 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d1fe0178-db8f-44e3-9e53-a2450914080a\") " pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.762007 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d1fe0178-db8f-44e3-9e53-a2450914080a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d1fe0178-db8f-44e3-9e53-a2450914080a\") " pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.762090 4687 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtnf8\" (UniqueName: \"kubernetes.io/projected/d1fe0178-db8f-44e3-9e53-a2450914080a-kube-api-access-rtnf8\") pod \"openstack-cell1-galera-0\" (UID: \"d1fe0178-db8f-44e3-9e53-a2450914080a\") " pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.762115 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d1fe0178-db8f-44e3-9e53-a2450914080a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d1fe0178-db8f-44e3-9e53-a2450914080a\") " pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.762134 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1fe0178-db8f-44e3-9e53-a2450914080a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d1fe0178-db8f-44e3-9e53-a2450914080a\") " pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.863686 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d1fe0178-db8f-44e3-9e53-a2450914080a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d1fe0178-db8f-44e3-9e53-a2450914080a\") " pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.863736 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1fe0178-db8f-44e3-9e53-a2450914080a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d1fe0178-db8f-44e3-9e53-a2450914080a\") " pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 
09:18:32.863783 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1fe0178-db8f-44e3-9e53-a2450914080a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d1fe0178-db8f-44e3-9e53-a2450914080a\") " pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.863801 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1fe0178-db8f-44e3-9e53-a2450914080a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d1fe0178-db8f-44e3-9e53-a2450914080a\") " pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.863841 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d1fe0178-db8f-44e3-9e53-a2450914080a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d1fe0178-db8f-44e3-9e53-a2450914080a\") " pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.863863 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d1fe0178-db8f-44e3-9e53-a2450914080a\") " pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.863893 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d1fe0178-db8f-44e3-9e53-a2450914080a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d1fe0178-db8f-44e3-9e53-a2450914080a\") " pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.863918 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rtnf8\" (UniqueName: \"kubernetes.io/projected/d1fe0178-db8f-44e3-9e53-a2450914080a-kube-api-access-rtnf8\") pod \"openstack-cell1-galera-0\" (UID: \"d1fe0178-db8f-44e3-9e53-a2450914080a\") " pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.864817 4687 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d1fe0178-db8f-44e3-9e53-a2450914080a\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.865183 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d1fe0178-db8f-44e3-9e53-a2450914080a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d1fe0178-db8f-44e3-9e53-a2450914080a\") " pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.865510 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d1fe0178-db8f-44e3-9e53-a2450914080a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d1fe0178-db8f-44e3-9e53-a2450914080a\") " pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.865854 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1fe0178-db8f-44e3-9e53-a2450914080a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d1fe0178-db8f-44e3-9e53-a2450914080a\") " pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.866436 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/d1fe0178-db8f-44e3-9e53-a2450914080a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d1fe0178-db8f-44e3-9e53-a2450914080a\") " pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.868903 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1fe0178-db8f-44e3-9e53-a2450914080a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d1fe0178-db8f-44e3-9e53-a2450914080a\") " pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.879835 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1fe0178-db8f-44e3-9e53-a2450914080a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d1fe0178-db8f-44e3-9e53-a2450914080a\") " pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.883419 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtnf8\" (UniqueName: \"kubernetes.io/projected/d1fe0178-db8f-44e3-9e53-a2450914080a-kube-api-access-rtnf8\") pod \"openstack-cell1-galera-0\" (UID: \"d1fe0178-db8f-44e3-9e53-a2450914080a\") " pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.883961 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d1fe0178-db8f-44e3-9e53-a2450914080a\") " pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.954155 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.955120 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.957620 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.958084 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-9jvfw" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.958870 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 28 09:18:32 crc kubenswrapper[4687]: I0228 09:18:32.965746 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 28 09:18:33 crc kubenswrapper[4687]: I0228 09:18:33.027825 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 28 09:18:33 crc kubenswrapper[4687]: I0228 09:18:33.033115 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-qzps4" event={"ID":"f6bffbc2-0283-4286-9d05-2b60186e0740","Type":"ContainerStarted","Data":"1492ced404b22343cd4f039c88b82fddbbcf9fe9b50f77c35c8debfb4dc0b97d"} Feb 28 09:18:33 crc kubenswrapper[4687]: I0228 09:18:33.072686 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/48796fdd-f9c8-473a-b17f-c6da6d0ba3a5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"48796fdd-f9c8-473a-b17f-c6da6d0ba3a5\") " pod="openstack/memcached-0" Feb 28 09:18:33 crc kubenswrapper[4687]: I0228 09:18:33.072833 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48796fdd-f9c8-473a-b17f-c6da6d0ba3a5-config-data\") pod \"memcached-0\" (UID: \"48796fdd-f9c8-473a-b17f-c6da6d0ba3a5\") " pod="openstack/memcached-0" Feb 28 09:18:33 crc kubenswrapper[4687]: 
I0228 09:18:33.072898 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/48796fdd-f9c8-473a-b17f-c6da6d0ba3a5-kolla-config\") pod \"memcached-0\" (UID: \"48796fdd-f9c8-473a-b17f-c6da6d0ba3a5\") " pod="openstack/memcached-0" Feb 28 09:18:33 crc kubenswrapper[4687]: I0228 09:18:33.072970 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92sh5\" (UniqueName: \"kubernetes.io/projected/48796fdd-f9c8-473a-b17f-c6da6d0ba3a5-kube-api-access-92sh5\") pod \"memcached-0\" (UID: \"48796fdd-f9c8-473a-b17f-c6da6d0ba3a5\") " pod="openstack/memcached-0" Feb 28 09:18:33 crc kubenswrapper[4687]: I0228 09:18:33.073053 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48796fdd-f9c8-473a-b17f-c6da6d0ba3a5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"48796fdd-f9c8-473a-b17f-c6da6d0ba3a5\") " pod="openstack/memcached-0" Feb 28 09:18:33 crc kubenswrapper[4687]: I0228 09:18:33.174920 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/48796fdd-f9c8-473a-b17f-c6da6d0ba3a5-kolla-config\") pod \"memcached-0\" (UID: \"48796fdd-f9c8-473a-b17f-c6da6d0ba3a5\") " pod="openstack/memcached-0" Feb 28 09:18:33 crc kubenswrapper[4687]: I0228 09:18:33.175048 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92sh5\" (UniqueName: \"kubernetes.io/projected/48796fdd-f9c8-473a-b17f-c6da6d0ba3a5-kube-api-access-92sh5\") pod \"memcached-0\" (UID: \"48796fdd-f9c8-473a-b17f-c6da6d0ba3a5\") " pod="openstack/memcached-0" Feb 28 09:18:33 crc kubenswrapper[4687]: I0228 09:18:33.175089 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/48796fdd-f9c8-473a-b17f-c6da6d0ba3a5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"48796fdd-f9c8-473a-b17f-c6da6d0ba3a5\") " pod="openstack/memcached-0" Feb 28 09:18:33 crc kubenswrapper[4687]: I0228 09:18:33.175163 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/48796fdd-f9c8-473a-b17f-c6da6d0ba3a5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"48796fdd-f9c8-473a-b17f-c6da6d0ba3a5\") " pod="openstack/memcached-0" Feb 28 09:18:33 crc kubenswrapper[4687]: I0228 09:18:33.175225 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48796fdd-f9c8-473a-b17f-c6da6d0ba3a5-config-data\") pod \"memcached-0\" (UID: \"48796fdd-f9c8-473a-b17f-c6da6d0ba3a5\") " pod="openstack/memcached-0" Feb 28 09:18:33 crc kubenswrapper[4687]: I0228 09:18:33.175765 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/48796fdd-f9c8-473a-b17f-c6da6d0ba3a5-kolla-config\") pod \"memcached-0\" (UID: \"48796fdd-f9c8-473a-b17f-c6da6d0ba3a5\") " pod="openstack/memcached-0" Feb 28 09:18:33 crc kubenswrapper[4687]: I0228 09:18:33.176179 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48796fdd-f9c8-473a-b17f-c6da6d0ba3a5-config-data\") pod \"memcached-0\" (UID: \"48796fdd-f9c8-473a-b17f-c6da6d0ba3a5\") " pod="openstack/memcached-0" Feb 28 09:18:33 crc kubenswrapper[4687]: I0228 09:18:33.178902 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48796fdd-f9c8-473a-b17f-c6da6d0ba3a5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"48796fdd-f9c8-473a-b17f-c6da6d0ba3a5\") " pod="openstack/memcached-0" Feb 28 09:18:33 crc kubenswrapper[4687]: I0228 09:18:33.185433 4687 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/48796fdd-f9c8-473a-b17f-c6da6d0ba3a5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"48796fdd-f9c8-473a-b17f-c6da6d0ba3a5\") " pod="openstack/memcached-0" Feb 28 09:18:33 crc kubenswrapper[4687]: I0228 09:18:33.194470 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92sh5\" (UniqueName: \"kubernetes.io/projected/48796fdd-f9c8-473a-b17f-c6da6d0ba3a5-kube-api-access-92sh5\") pod \"memcached-0\" (UID: \"48796fdd-f9c8-473a-b17f-c6da6d0ba3a5\") " pod="openstack/memcached-0" Feb 28 09:18:33 crc kubenswrapper[4687]: I0228 09:18:33.267584 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 28 09:18:35 crc kubenswrapper[4687]: I0228 09:18:35.027140 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 28 09:18:35 crc kubenswrapper[4687]: I0228 09:18:35.029173 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 28 09:18:35 crc kubenswrapper[4687]: I0228 09:18:35.032591 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-dglt6" Feb 28 09:18:35 crc kubenswrapper[4687]: I0228 09:18:35.037688 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 28 09:18:35 crc kubenswrapper[4687]: I0228 09:18:35.108826 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42x2f\" (UniqueName: \"kubernetes.io/projected/a75c27c0-aef6-4631-9a63-521ba7e5889c-kube-api-access-42x2f\") pod \"kube-state-metrics-0\" (UID: \"a75c27c0-aef6-4631-9a63-521ba7e5889c\") " pod="openstack/kube-state-metrics-0" Feb 28 09:18:35 crc kubenswrapper[4687]: I0228 09:18:35.209750 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42x2f\" (UniqueName: \"kubernetes.io/projected/a75c27c0-aef6-4631-9a63-521ba7e5889c-kube-api-access-42x2f\") pod \"kube-state-metrics-0\" (UID: \"a75c27c0-aef6-4631-9a63-521ba7e5889c\") " pod="openstack/kube-state-metrics-0" Feb 28 09:18:35 crc kubenswrapper[4687]: I0228 09:18:35.228270 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42x2f\" (UniqueName: \"kubernetes.io/projected/a75c27c0-aef6-4631-9a63-521ba7e5889c-kube-api-access-42x2f\") pod \"kube-state-metrics-0\" (UID: \"a75c27c0-aef6-4631-9a63-521ba7e5889c\") " pod="openstack/kube-state-metrics-0" Feb 28 09:18:35 crc kubenswrapper[4687]: I0228 09:18:35.361723 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 28 09:18:35 crc kubenswrapper[4687]: I0228 09:18:35.422999 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.284700 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-grkmn"] Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.285961 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-grkmn" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.288007 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-llsd7" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.288443 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.288799 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.296681 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-kbhr4"] Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.298346 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-kbhr4" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.324725 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-kbhr4"] Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.338115 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-grkmn"] Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.375037 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ce17423e-ccd3-4aad-9538-2424a822d5df-var-run\") pod \"ovn-controller-ovs-kbhr4\" (UID: \"ce17423e-ccd3-4aad-9538-2424a822d5df\") " pod="openstack/ovn-controller-ovs-kbhr4" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.375088 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b7837572-8dcc-409d-b8fd-c37f2af52474-var-run-ovn\") pod \"ovn-controller-grkmn\" (UID: \"b7837572-8dcc-409d-b8fd-c37f2af52474\") " pod="openstack/ovn-controller-grkmn" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.375115 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ce17423e-ccd3-4aad-9538-2424a822d5df-var-lib\") pod \"ovn-controller-ovs-kbhr4\" (UID: \"ce17423e-ccd3-4aad-9538-2424a822d5df\") " pod="openstack/ovn-controller-ovs-kbhr4" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.375135 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7837572-8dcc-409d-b8fd-c37f2af52474-ovn-controller-tls-certs\") pod \"ovn-controller-grkmn\" (UID: \"b7837572-8dcc-409d-b8fd-c37f2af52474\") " pod="openstack/ovn-controller-grkmn" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 
09:18:38.375170 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7837572-8dcc-409d-b8fd-c37f2af52474-scripts\") pod \"ovn-controller-grkmn\" (UID: \"b7837572-8dcc-409d-b8fd-c37f2af52474\") " pod="openstack/ovn-controller-grkmn" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.375274 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4lqv\" (UniqueName: \"kubernetes.io/projected/b7837572-8dcc-409d-b8fd-c37f2af52474-kube-api-access-c4lqv\") pod \"ovn-controller-grkmn\" (UID: \"b7837572-8dcc-409d-b8fd-c37f2af52474\") " pod="openstack/ovn-controller-grkmn" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.375297 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b7837572-8dcc-409d-b8fd-c37f2af52474-var-run\") pod \"ovn-controller-grkmn\" (UID: \"b7837572-8dcc-409d-b8fd-c37f2af52474\") " pod="openstack/ovn-controller-grkmn" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.375315 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7837572-8dcc-409d-b8fd-c37f2af52474-combined-ca-bundle\") pod \"ovn-controller-grkmn\" (UID: \"b7837572-8dcc-409d-b8fd-c37f2af52474\") " pod="openstack/ovn-controller-grkmn" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.375329 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b7837572-8dcc-409d-b8fd-c37f2af52474-var-log-ovn\") pod \"ovn-controller-grkmn\" (UID: \"b7837572-8dcc-409d-b8fd-c37f2af52474\") " pod="openstack/ovn-controller-grkmn" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.375430 4687 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ce17423e-ccd3-4aad-9538-2424a822d5df-etc-ovs\") pod \"ovn-controller-ovs-kbhr4\" (UID: \"ce17423e-ccd3-4aad-9538-2424a822d5df\") " pod="openstack/ovn-controller-ovs-kbhr4" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.375466 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce17423e-ccd3-4aad-9538-2424a822d5df-scripts\") pod \"ovn-controller-ovs-kbhr4\" (UID: \"ce17423e-ccd3-4aad-9538-2424a822d5df\") " pod="openstack/ovn-controller-ovs-kbhr4" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.375522 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dzbz\" (UniqueName: \"kubernetes.io/projected/ce17423e-ccd3-4aad-9538-2424a822d5df-kube-api-access-7dzbz\") pod \"ovn-controller-ovs-kbhr4\" (UID: \"ce17423e-ccd3-4aad-9538-2424a822d5df\") " pod="openstack/ovn-controller-ovs-kbhr4" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.375552 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ce17423e-ccd3-4aad-9538-2424a822d5df-var-log\") pod \"ovn-controller-ovs-kbhr4\" (UID: \"ce17423e-ccd3-4aad-9538-2424a822d5df\") " pod="openstack/ovn-controller-ovs-kbhr4" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.477622 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4lqv\" (UniqueName: \"kubernetes.io/projected/b7837572-8dcc-409d-b8fd-c37f2af52474-kube-api-access-c4lqv\") pod \"ovn-controller-grkmn\" (UID: \"b7837572-8dcc-409d-b8fd-c37f2af52474\") " pod="openstack/ovn-controller-grkmn" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.478381 4687 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b7837572-8dcc-409d-b8fd-c37f2af52474-var-run\") pod \"ovn-controller-grkmn\" (UID: \"b7837572-8dcc-409d-b8fd-c37f2af52474\") " pod="openstack/ovn-controller-grkmn" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.478445 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7837572-8dcc-409d-b8fd-c37f2af52474-combined-ca-bundle\") pod \"ovn-controller-grkmn\" (UID: \"b7837572-8dcc-409d-b8fd-c37f2af52474\") " pod="openstack/ovn-controller-grkmn" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.478474 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b7837572-8dcc-409d-b8fd-c37f2af52474-var-log-ovn\") pod \"ovn-controller-grkmn\" (UID: \"b7837572-8dcc-409d-b8fd-c37f2af52474\") " pod="openstack/ovn-controller-grkmn" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.478538 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ce17423e-ccd3-4aad-9538-2424a822d5df-etc-ovs\") pod \"ovn-controller-ovs-kbhr4\" (UID: \"ce17423e-ccd3-4aad-9538-2424a822d5df\") " pod="openstack/ovn-controller-ovs-kbhr4" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.478566 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce17423e-ccd3-4aad-9538-2424a822d5df-scripts\") pod \"ovn-controller-ovs-kbhr4\" (UID: \"ce17423e-ccd3-4aad-9538-2424a822d5df\") " pod="openstack/ovn-controller-ovs-kbhr4" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.478640 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dzbz\" (UniqueName: \"kubernetes.io/projected/ce17423e-ccd3-4aad-9538-2424a822d5df-kube-api-access-7dzbz\") pod 
\"ovn-controller-ovs-kbhr4\" (UID: \"ce17423e-ccd3-4aad-9538-2424a822d5df\") " pod="openstack/ovn-controller-ovs-kbhr4" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.478668 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ce17423e-ccd3-4aad-9538-2424a822d5df-var-log\") pod \"ovn-controller-ovs-kbhr4\" (UID: \"ce17423e-ccd3-4aad-9538-2424a822d5df\") " pod="openstack/ovn-controller-ovs-kbhr4" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.478735 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ce17423e-ccd3-4aad-9538-2424a822d5df-var-run\") pod \"ovn-controller-ovs-kbhr4\" (UID: \"ce17423e-ccd3-4aad-9538-2424a822d5df\") " pod="openstack/ovn-controller-ovs-kbhr4" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.478787 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b7837572-8dcc-409d-b8fd-c37f2af52474-var-run-ovn\") pod \"ovn-controller-grkmn\" (UID: \"b7837572-8dcc-409d-b8fd-c37f2af52474\") " pod="openstack/ovn-controller-grkmn" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.478822 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ce17423e-ccd3-4aad-9538-2424a822d5df-var-lib\") pod \"ovn-controller-ovs-kbhr4\" (UID: \"ce17423e-ccd3-4aad-9538-2424a822d5df\") " pod="openstack/ovn-controller-ovs-kbhr4" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.478852 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7837572-8dcc-409d-b8fd-c37f2af52474-ovn-controller-tls-certs\") pod \"ovn-controller-grkmn\" (UID: \"b7837572-8dcc-409d-b8fd-c37f2af52474\") " pod="openstack/ovn-controller-grkmn" Feb 28 09:18:38 crc 
kubenswrapper[4687]: I0228 09:18:38.478894 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7837572-8dcc-409d-b8fd-c37f2af52474-scripts\") pod \"ovn-controller-grkmn\" (UID: \"b7837572-8dcc-409d-b8fd-c37f2af52474\") " pod="openstack/ovn-controller-grkmn" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.480059 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b7837572-8dcc-409d-b8fd-c37f2af52474-var-log-ovn\") pod \"ovn-controller-grkmn\" (UID: \"b7837572-8dcc-409d-b8fd-c37f2af52474\") " pod="openstack/ovn-controller-grkmn" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.480113 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ce17423e-ccd3-4aad-9538-2424a822d5df-var-run\") pod \"ovn-controller-ovs-kbhr4\" (UID: \"ce17423e-ccd3-4aad-9538-2424a822d5df\") " pod="openstack/ovn-controller-ovs-kbhr4" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.480258 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ce17423e-ccd3-4aad-9538-2424a822d5df-var-lib\") pod \"ovn-controller-ovs-kbhr4\" (UID: \"ce17423e-ccd3-4aad-9538-2424a822d5df\") " pod="openstack/ovn-controller-ovs-kbhr4" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.480286 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ce17423e-ccd3-4aad-9538-2424a822d5df-var-log\") pod \"ovn-controller-ovs-kbhr4\" (UID: \"ce17423e-ccd3-4aad-9538-2424a822d5df\") " pod="openstack/ovn-controller-ovs-kbhr4" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.480386 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/b7837572-8dcc-409d-b8fd-c37f2af52474-var-run-ovn\") pod \"ovn-controller-grkmn\" (UID: \"b7837572-8dcc-409d-b8fd-c37f2af52474\") " pod="openstack/ovn-controller-grkmn" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.480773 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b7837572-8dcc-409d-b8fd-c37f2af52474-var-run\") pod \"ovn-controller-grkmn\" (UID: \"b7837572-8dcc-409d-b8fd-c37f2af52474\") " pod="openstack/ovn-controller-grkmn" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.484047 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce17423e-ccd3-4aad-9538-2424a822d5df-scripts\") pod \"ovn-controller-ovs-kbhr4\" (UID: \"ce17423e-ccd3-4aad-9538-2424a822d5df\") " pod="openstack/ovn-controller-ovs-kbhr4" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.484162 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ce17423e-ccd3-4aad-9538-2424a822d5df-etc-ovs\") pod \"ovn-controller-ovs-kbhr4\" (UID: \"ce17423e-ccd3-4aad-9538-2424a822d5df\") " pod="openstack/ovn-controller-ovs-kbhr4" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.484408 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7837572-8dcc-409d-b8fd-c37f2af52474-scripts\") pod \"ovn-controller-grkmn\" (UID: \"b7837572-8dcc-409d-b8fd-c37f2af52474\") " pod="openstack/ovn-controller-grkmn" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.487089 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7837572-8dcc-409d-b8fd-c37f2af52474-ovn-controller-tls-certs\") pod \"ovn-controller-grkmn\" (UID: \"b7837572-8dcc-409d-b8fd-c37f2af52474\") " pod="openstack/ovn-controller-grkmn" Feb 
28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.492103 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7837572-8dcc-409d-b8fd-c37f2af52474-combined-ca-bundle\") pod \"ovn-controller-grkmn\" (UID: \"b7837572-8dcc-409d-b8fd-c37f2af52474\") " pod="openstack/ovn-controller-grkmn" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.494756 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dzbz\" (UniqueName: \"kubernetes.io/projected/ce17423e-ccd3-4aad-9538-2424a822d5df-kube-api-access-7dzbz\") pod \"ovn-controller-ovs-kbhr4\" (UID: \"ce17423e-ccd3-4aad-9538-2424a822d5df\") " pod="openstack/ovn-controller-ovs-kbhr4" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.507214 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4lqv\" (UniqueName: \"kubernetes.io/projected/b7837572-8dcc-409d-b8fd-c37f2af52474-kube-api-access-c4lqv\") pod \"ovn-controller-grkmn\" (UID: \"b7837572-8dcc-409d-b8fd-c37f2af52474\") " pod="openstack/ovn-controller-grkmn" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.603639 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-grkmn" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.615365 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-kbhr4" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.769011 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.779753 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.790990 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.791650 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.792287 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-vnwdq" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.792364 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.792941 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.794440 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.895497 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695\") " pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.895589 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695\") " pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.895658 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695-config\") pod \"ovsdbserver-nb-0\" (UID: \"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695\") " pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.895753 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695\") " pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.895953 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695\") " pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.896128 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695\") " pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.896201 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sqqz\" (UniqueName: \"kubernetes.io/projected/f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695-kube-api-access-6sqqz\") pod \"ovsdbserver-nb-0\" (UID: \"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695\") " pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.896306 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695\") " pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.997359 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695\") " pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.997420 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695\") " pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.997457 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sqqz\" (UniqueName: \"kubernetes.io/projected/f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695-kube-api-access-6sqqz\") pod \"ovsdbserver-nb-0\" (UID: \"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695\") " pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.997477 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695\") " pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.997519 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695\") " pod="openstack/ovsdbserver-nb-0" Feb 28 
09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.997541 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695\") " pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.997570 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695-config\") pod \"ovsdbserver-nb-0\" (UID: \"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695\") " pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.997590 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695\") " pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.998136 4687 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.998225 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695\") " pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.998870 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695\") " pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:38 crc kubenswrapper[4687]: I0228 09:18:38.998866 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695-config\") pod \"ovsdbserver-nb-0\" (UID: \"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695\") " pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:39 crc kubenswrapper[4687]: I0228 09:18:39.000972 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695\") " pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:39 crc kubenswrapper[4687]: I0228 09:18:39.002377 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695\") " pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:39 crc kubenswrapper[4687]: I0228 09:18:39.004713 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695\") " pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:39 crc kubenswrapper[4687]: I0228 09:18:39.012671 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sqqz\" (UniqueName: \"kubernetes.io/projected/f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695-kube-api-access-6sqqz\") pod \"ovsdbserver-nb-0\" (UID: \"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695\") " 
pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:39 crc kubenswrapper[4687]: I0228 09:18:39.017268 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695\") " pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:39 crc kubenswrapper[4687]: I0228 09:18:39.108501 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:39 crc kubenswrapper[4687]: W0228 09:18:39.204871 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1fac181_ae33_45e1_8171_1d998d59bc04.slice/crio-47d7802ba952db224a22e6d2ddb43a5188f7b4b15afe7ea8230dcd6271cbf841 WatchSource:0}: Error finding container 47d7802ba952db224a22e6d2ddb43a5188f7b4b15afe7ea8230dcd6271cbf841: Status 404 returned error can't find the container with id 47d7802ba952db224a22e6d2ddb43a5188f7b4b15afe7ea8230dcd6271cbf841 Feb 28 09:18:39 crc kubenswrapper[4687]: I0228 09:18:39.580782 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-2qbft"] Feb 28 09:18:39 crc kubenswrapper[4687]: I0228 09:18:39.989862 4687 scope.go:117] "RemoveContainer" containerID="88cf144858170a73b2f4fcb48fce7f766c95fd90081524d373422e858474102b" Feb 28 09:18:40 crc kubenswrapper[4687]: I0228 09:18:40.120854 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c1fac181-ae33-45e1-8171-1d998d59bc04","Type":"ContainerStarted","Data":"47d7802ba952db224a22e6d2ddb43a5188f7b4b15afe7ea8230dcd6271cbf841"} Feb 28 09:18:40 crc kubenswrapper[4687]: I0228 09:18:40.123899 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-2qbft" 
event={"ID":"f611fd7a-502d-4db5-ad7f-eae15ccd9486","Type":"ContainerStarted","Data":"a5a0afad058d51a498acc85fae711a8fd1f2f09354d1936afc7823a0c03dc65c"} Feb 28 09:18:40 crc kubenswrapper[4687]: I0228 09:18:40.369960 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 28 09:18:40 crc kubenswrapper[4687]: I0228 09:18:40.375965 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 28 09:18:40 crc kubenswrapper[4687]: I0228 09:18:40.398754 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 28 09:18:40 crc kubenswrapper[4687]: I0228 09:18:40.402160 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 28 09:18:40 crc kubenswrapper[4687]: W0228 09:18:40.438045 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48796fdd_f9c8_473a_b17f_c6da6d0ba3a5.slice/crio-024de7524335711270d6b9bddc7ebef4dabc129e8267bd0a2713cb24a16b39fa WatchSource:0}: Error finding container 024de7524335711270d6b9bddc7ebef4dabc129e8267bd0a2713cb24a16b39fa: Status 404 returned error can't find the container with id 024de7524335711270d6b9bddc7ebef4dabc129e8267bd0a2713cb24a16b39fa Feb 28 09:18:40 crc kubenswrapper[4687]: I0228 09:18:40.525707 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-kbhr4"] Feb 28 09:18:40 crc kubenswrapper[4687]: I0228 09:18:40.532122 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-grkmn"] Feb 28 09:18:40 crc kubenswrapper[4687]: W0228 09:18:40.537907 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7837572_8dcc_409d_b8fd_c37f2af52474.slice/crio-f28094a1b2d02fb6fd464cca4d2a07abe2cc9817ac21251a03fc72b060a2ee88 WatchSource:0}: Error finding container 
f28094a1b2d02fb6fd464cca4d2a07abe2cc9817ac21251a03fc72b060a2ee88: Status 404 returned error can't find the container with id f28094a1b2d02fb6fd464cca4d2a07abe2cc9817ac21251a03fc72b060a2ee88 Feb 28 09:18:40 crc kubenswrapper[4687]: I0228 09:18:40.568416 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 28 09:18:40 crc kubenswrapper[4687]: W0228 09:18:40.580751 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda75c27c0_aef6_4631_9a63_521ba7e5889c.slice/crio-b68ad094ec81285cd9113114d8962e1bc59b32b47951cc5e3ff499bdfc4fb5fc WatchSource:0}: Error finding container b68ad094ec81285cd9113114d8962e1bc59b32b47951cc5e3ff499bdfc4fb5fc: Status 404 returned error can't find the container with id b68ad094ec81285cd9113114d8962e1bc59b32b47951cc5e3ff499bdfc4fb5fc Feb 28 09:18:40 crc kubenswrapper[4687]: I0228 09:18:40.637649 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 28 09:18:40 crc kubenswrapper[4687]: W0228 09:18:40.641378 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9ed6dc4_5a44_4cc0_9bc4_9f132aae1695.slice/crio-72914c124b18c7cb5e3b8ef96ae397011fa50c063a28525d6655b46630d0b4fe WatchSource:0}: Error finding container 72914c124b18c7cb5e3b8ef96ae397011fa50c063a28525d6655b46630d0b4fe: Status 404 returned error can't find the container with id 72914c124b18c7cb5e3b8ef96ae397011fa50c063a28525d6655b46630d0b4fe Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.079121 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-csrrp"] Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.080770 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-csrrp" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.085422 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.094211 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-csrrp"] Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.162008 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4f7bb81-e353-405c-9676-8a57d0886dae-config\") pod \"ovn-controller-metrics-csrrp\" (UID: \"d4f7bb81-e353-405c-9676-8a57d0886dae\") " pod="openstack/ovn-controller-metrics-csrrp" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.162184 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4f7bb81-e353-405c-9676-8a57d0886dae-combined-ca-bundle\") pod \"ovn-controller-metrics-csrrp\" (UID: \"d4f7bb81-e353-405c-9676-8a57d0886dae\") " pod="openstack/ovn-controller-metrics-csrrp" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.162369 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d4f7bb81-e353-405c-9676-8a57d0886dae-ovs-rundir\") pod \"ovn-controller-metrics-csrrp\" (UID: \"d4f7bb81-e353-405c-9676-8a57d0886dae\") " pod="openstack/ovn-controller-metrics-csrrp" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.162512 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4f7bb81-e353-405c-9676-8a57d0886dae-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-csrrp\" (UID: \"d4f7bb81-e353-405c-9676-8a57d0886dae\") " 
pod="openstack/ovn-controller-metrics-csrrp" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.162642 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tdf9\" (UniqueName: \"kubernetes.io/projected/d4f7bb81-e353-405c-9676-8a57d0886dae-kube-api-access-5tdf9\") pod \"ovn-controller-metrics-csrrp\" (UID: \"d4f7bb81-e353-405c-9676-8a57d0886dae\") " pod="openstack/ovn-controller-metrics-csrrp" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.162791 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d4f7bb81-e353-405c-9676-8a57d0886dae-ovn-rundir\") pod \"ovn-controller-metrics-csrrp\" (UID: \"d4f7bb81-e353-405c-9676-8a57d0886dae\") " pod="openstack/ovn-controller-metrics-csrrp" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.179817 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695","Type":"ContainerStarted","Data":"72914c124b18c7cb5e3b8ef96ae397011fa50c063a28525d6655b46630d0b4fe"} Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.191090 4687 generic.go:334] "Generic (PLEG): container finished" podID="d0151919-e58a-406d-939d-d88c8103e6f8" containerID="dadfc09c302543ae8814d85143b168c1f40dcf4303954da228152a489574d0a8" exitCode=0 Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.191181 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-9zft7" event={"ID":"d0151919-e58a-406d-939d-d88c8103e6f8","Type":"ContainerDied","Data":"dadfc09c302543ae8814d85143b168c1f40dcf4303954da228152a489574d0a8"} Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.194933 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"541f5799-4b5e-4767-aca7-8c3738502a06","Type":"ContainerStarted","Data":"206e2422ee3b551de75917d879a6617d4a05b1f456afc649e089e1537fea3d4c"} Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.199508 4687 generic.go:334] "Generic (PLEG): container finished" podID="f611fd7a-502d-4db5-ad7f-eae15ccd9486" containerID="b295fbf01e0a1931dead7c7a3d99c745155d27959cf4ba695b0b34ee6b59fdeb" exitCode=0 Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.199584 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-2qbft" event={"ID":"f611fd7a-502d-4db5-ad7f-eae15ccd9486","Type":"ContainerDied","Data":"b295fbf01e0a1931dead7c7a3d99c745155d27959cf4ba695b0b34ee6b59fdeb"} Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.207402 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-qzps4"] Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.215394 4687 generic.go:334] "Generic (PLEG): container finished" podID="f6bffbc2-0283-4286-9d05-2b60186e0740" containerID="46d18a03e96e176491ee7e5b14e4ec9e06b5e18d5b539ce140fd4fa0796a92dd" exitCode=0 Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.215503 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-qzps4" event={"ID":"f6bffbc2-0283-4286-9d05-2b60186e0740","Type":"ContainerDied","Data":"46d18a03e96e176491ee7e5b14e4ec9e06b5e18d5b539ce140fd4fa0796a92dd"} Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.220821 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kbhr4" event={"ID":"ce17423e-ccd3-4aad-9538-2424a822d5df","Type":"ContainerStarted","Data":"dc5be35a108250a58725ba7699be7d7ce73a9cd446e1595a830749c0073881a3"} Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.226324 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-grkmn" 
event={"ID":"b7837572-8dcc-409d-b8fd-c37f2af52474","Type":"ContainerStarted","Data":"f28094a1b2d02fb6fd464cca4d2a07abe2cc9817ac21251a03fc72b060a2ee88"} Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.227539 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a75c27c0-aef6-4631-9a63-521ba7e5889c","Type":"ContainerStarted","Data":"b68ad094ec81285cd9113114d8962e1bc59b32b47951cc5e3ff499bdfc4fb5fc"} Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.235509 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"48796fdd-f9c8-473a-b17f-c6da6d0ba3a5","Type":"ContainerStarted","Data":"024de7524335711270d6b9bddc7ebef4dabc129e8267bd0a2713cb24a16b39fa"} Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.236655 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d1fe0178-db8f-44e3-9e53-a2450914080a","Type":"ContainerStarted","Data":"b5769a8ec6f411608ce48b7825a60fc72052e2b247cc06cbf49b07cb1eb7c15f"} Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.238051 4687 generic.go:334] "Generic (PLEG): container finished" podID="e5cd53cf-b205-4c6c-92be-155def921e74" containerID="0de47c9b5d15a0a68f5b2639550d89c859455526bfc116eb9b9be7d6b7b5344c" exitCode=0 Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.238093 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-l2vmx" event={"ID":"e5cd53cf-b205-4c6c-92be-155def921e74","Type":"ContainerDied","Data":"0de47c9b5d15a0a68f5b2639550d89c859455526bfc116eb9b9be7d6b7b5344c"} Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.244482 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"171eb8fe-deaf-4936-b51d-de02b4131b8b","Type":"ContainerStarted","Data":"c835050cda7388df5c0329a89bc25e1c3f3497740cfe7d7c1128fb951745ab22"} Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 
09:18:41.266880 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d4f7bb81-e353-405c-9676-8a57d0886dae-ovs-rundir\") pod \"ovn-controller-metrics-csrrp\" (UID: \"d4f7bb81-e353-405c-9676-8a57d0886dae\") " pod="openstack/ovn-controller-metrics-csrrp" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.266925 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4f7bb81-e353-405c-9676-8a57d0886dae-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-csrrp\" (UID: \"d4f7bb81-e353-405c-9676-8a57d0886dae\") " pod="openstack/ovn-controller-metrics-csrrp" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.267009 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tdf9\" (UniqueName: \"kubernetes.io/projected/d4f7bb81-e353-405c-9676-8a57d0886dae-kube-api-access-5tdf9\") pod \"ovn-controller-metrics-csrrp\" (UID: \"d4f7bb81-e353-405c-9676-8a57d0886dae\") " pod="openstack/ovn-controller-metrics-csrrp" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.267153 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d4f7bb81-e353-405c-9676-8a57d0886dae-ovn-rundir\") pod \"ovn-controller-metrics-csrrp\" (UID: \"d4f7bb81-e353-405c-9676-8a57d0886dae\") " pod="openstack/ovn-controller-metrics-csrrp" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.267233 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4f7bb81-e353-405c-9676-8a57d0886dae-config\") pod \"ovn-controller-metrics-csrrp\" (UID: \"d4f7bb81-e353-405c-9676-8a57d0886dae\") " pod="openstack/ovn-controller-metrics-csrrp" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.267267 4687 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4f7bb81-e353-405c-9676-8a57d0886dae-combined-ca-bundle\") pod \"ovn-controller-metrics-csrrp\" (UID: \"d4f7bb81-e353-405c-9676-8a57d0886dae\") " pod="openstack/ovn-controller-metrics-csrrp" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.267375 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d4f7bb81-e353-405c-9676-8a57d0886dae-ovs-rundir\") pod \"ovn-controller-metrics-csrrp\" (UID: \"d4f7bb81-e353-405c-9676-8a57d0886dae\") " pod="openstack/ovn-controller-metrics-csrrp" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.268942 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d4f7bb81-e353-405c-9676-8a57d0886dae-ovn-rundir\") pod \"ovn-controller-metrics-csrrp\" (UID: \"d4f7bb81-e353-405c-9676-8a57d0886dae\") " pod="openstack/ovn-controller-metrics-csrrp" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.269832 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4f7bb81-e353-405c-9676-8a57d0886dae-config\") pod \"ovn-controller-metrics-csrrp\" (UID: \"d4f7bb81-e353-405c-9676-8a57d0886dae\") " pod="openstack/ovn-controller-metrics-csrrp" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.278229 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4f7bb81-e353-405c-9676-8a57d0886dae-combined-ca-bundle\") pod \"ovn-controller-metrics-csrrp\" (UID: \"d4f7bb81-e353-405c-9676-8a57d0886dae\") " pod="openstack/ovn-controller-metrics-csrrp" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.280683 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d4f7bb81-e353-405c-9676-8a57d0886dae-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-csrrp\" (UID: \"d4f7bb81-e353-405c-9676-8a57d0886dae\") " pod="openstack/ovn-controller-metrics-csrrp" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.284323 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tdf9\" (UniqueName: \"kubernetes.io/projected/d4f7bb81-e353-405c-9676-8a57d0886dae-kube-api-access-5tdf9\") pod \"ovn-controller-metrics-csrrp\" (UID: \"d4f7bb81-e353-405c-9676-8a57d0886dae\") " pod="openstack/ovn-controller-metrics-csrrp" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.303555 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-lz77z"] Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.305725 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-lz77z" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.307569 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.311008 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-lz77z"] Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.369197 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r2k9\" (UniqueName: \"kubernetes.io/projected/ceb60d0e-2588-4d0d-abf1-afef4e684fc1-kube-api-access-7r2k9\") pod \"dnsmasq-dns-6444958b7f-lz77z\" (UID: \"ceb60d0e-2588-4d0d-abf1-afef4e684fc1\") " pod="openstack/dnsmasq-dns-6444958b7f-lz77z" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.369274 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceb60d0e-2588-4d0d-abf1-afef4e684fc1-config\") pod 
\"dnsmasq-dns-6444958b7f-lz77z\" (UID: \"ceb60d0e-2588-4d0d-abf1-afef4e684fc1\") " pod="openstack/dnsmasq-dns-6444958b7f-lz77z" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.369293 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ceb60d0e-2588-4d0d-abf1-afef4e684fc1-dns-svc\") pod \"dnsmasq-dns-6444958b7f-lz77z\" (UID: \"ceb60d0e-2588-4d0d-abf1-afef4e684fc1\") " pod="openstack/dnsmasq-dns-6444958b7f-lz77z" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.369311 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ceb60d0e-2588-4d0d-abf1-afef4e684fc1-ovsdbserver-nb\") pod \"dnsmasq-dns-6444958b7f-lz77z\" (UID: \"ceb60d0e-2588-4d0d-abf1-afef4e684fc1\") " pod="openstack/dnsmasq-dns-6444958b7f-lz77z" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.420353 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-csrrp" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.472807 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceb60d0e-2588-4d0d-abf1-afef4e684fc1-config\") pod \"dnsmasq-dns-6444958b7f-lz77z\" (UID: \"ceb60d0e-2588-4d0d-abf1-afef4e684fc1\") " pod="openstack/dnsmasq-dns-6444958b7f-lz77z" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.472844 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ceb60d0e-2588-4d0d-abf1-afef4e684fc1-dns-svc\") pod \"dnsmasq-dns-6444958b7f-lz77z\" (UID: \"ceb60d0e-2588-4d0d-abf1-afef4e684fc1\") " pod="openstack/dnsmasq-dns-6444958b7f-lz77z" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.472864 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ceb60d0e-2588-4d0d-abf1-afef4e684fc1-ovsdbserver-nb\") pod \"dnsmasq-dns-6444958b7f-lz77z\" (UID: \"ceb60d0e-2588-4d0d-abf1-afef4e684fc1\") " pod="openstack/dnsmasq-dns-6444958b7f-lz77z" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.472970 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r2k9\" (UniqueName: \"kubernetes.io/projected/ceb60d0e-2588-4d0d-abf1-afef4e684fc1-kube-api-access-7r2k9\") pod \"dnsmasq-dns-6444958b7f-lz77z\" (UID: \"ceb60d0e-2588-4d0d-abf1-afef4e684fc1\") " pod="openstack/dnsmasq-dns-6444958b7f-lz77z" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.473907 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceb60d0e-2588-4d0d-abf1-afef4e684fc1-config\") pod \"dnsmasq-dns-6444958b7f-lz77z\" (UID: \"ceb60d0e-2588-4d0d-abf1-afef4e684fc1\") " pod="openstack/dnsmasq-dns-6444958b7f-lz77z" Feb 28 09:18:41 crc 
kubenswrapper[4687]: I0228 09:18:41.473948 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ceb60d0e-2588-4d0d-abf1-afef4e684fc1-dns-svc\") pod \"dnsmasq-dns-6444958b7f-lz77z\" (UID: \"ceb60d0e-2588-4d0d-abf1-afef4e684fc1\") " pod="openstack/dnsmasq-dns-6444958b7f-lz77z" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.474140 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ceb60d0e-2588-4d0d-abf1-afef4e684fc1-ovsdbserver-nb\") pod \"dnsmasq-dns-6444958b7f-lz77z\" (UID: \"ceb60d0e-2588-4d0d-abf1-afef4e684fc1\") " pod="openstack/dnsmasq-dns-6444958b7f-lz77z" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.507595 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r2k9\" (UniqueName: \"kubernetes.io/projected/ceb60d0e-2588-4d0d-abf1-afef4e684fc1-kube-api-access-7r2k9\") pod \"dnsmasq-dns-6444958b7f-lz77z\" (UID: \"ceb60d0e-2588-4d0d-abf1-afef4e684fc1\") " pod="openstack/dnsmasq-dns-6444958b7f-lz77z" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.625995 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-lz77z" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.988844 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-l2vmx" Feb 28 09:18:41 crc kubenswrapper[4687]: I0228 09:18:41.996331 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-9zft7" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.083396 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrqz4\" (UniqueName: \"kubernetes.io/projected/d0151919-e58a-406d-939d-d88c8103e6f8-kube-api-access-qrqz4\") pod \"d0151919-e58a-406d-939d-d88c8103e6f8\" (UID: \"d0151919-e58a-406d-939d-d88c8103e6f8\") " Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.083473 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlwb4\" (UniqueName: \"kubernetes.io/projected/e5cd53cf-b205-4c6c-92be-155def921e74-kube-api-access-jlwb4\") pod \"e5cd53cf-b205-4c6c-92be-155def921e74\" (UID: \"e5cd53cf-b205-4c6c-92be-155def921e74\") " Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.083623 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0151919-e58a-406d-939d-d88c8103e6f8-config\") pod \"d0151919-e58a-406d-939d-d88c8103e6f8\" (UID: \"d0151919-e58a-406d-939d-d88c8103e6f8\") " Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.083759 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5cd53cf-b205-4c6c-92be-155def921e74-dns-svc\") pod \"e5cd53cf-b205-4c6c-92be-155def921e74\" (UID: \"e5cd53cf-b205-4c6c-92be-155def921e74\") " Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.083788 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5cd53cf-b205-4c6c-92be-155def921e74-config\") pod \"e5cd53cf-b205-4c6c-92be-155def921e74\" (UID: \"e5cd53cf-b205-4c6c-92be-155def921e74\") " Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.086448 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/d0151919-e58a-406d-939d-d88c8103e6f8-kube-api-access-qrqz4" (OuterVolumeSpecName: "kube-api-access-qrqz4") pod "d0151919-e58a-406d-939d-d88c8103e6f8" (UID: "d0151919-e58a-406d-939d-d88c8103e6f8"). InnerVolumeSpecName "kube-api-access-qrqz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.088101 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5cd53cf-b205-4c6c-92be-155def921e74-kube-api-access-jlwb4" (OuterVolumeSpecName: "kube-api-access-jlwb4") pod "e5cd53cf-b205-4c6c-92be-155def921e74" (UID: "e5cd53cf-b205-4c6c-92be-155def921e74"). InnerVolumeSpecName "kube-api-access-jlwb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.101257 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0151919-e58a-406d-939d-d88c8103e6f8-config" (OuterVolumeSpecName: "config") pod "d0151919-e58a-406d-939d-d88c8103e6f8" (UID: "d0151919-e58a-406d-939d-d88c8103e6f8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.104109 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5cd53cf-b205-4c6c-92be-155def921e74-config" (OuterVolumeSpecName: "config") pod "e5cd53cf-b205-4c6c-92be-155def921e74" (UID: "e5cd53cf-b205-4c6c-92be-155def921e74"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.104420 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5cd53cf-b205-4c6c-92be-155def921e74-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e5cd53cf-b205-4c6c-92be-155def921e74" (UID: "e5cd53cf-b205-4c6c-92be-155def921e74"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.185491 4687 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5cd53cf-b205-4c6c-92be-155def921e74-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.185520 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5cd53cf-b205-4c6c-92be-155def921e74-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.185551 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrqz4\" (UniqueName: \"kubernetes.io/projected/d0151919-e58a-406d-939d-d88c8103e6f8-kube-api-access-qrqz4\") on node \"crc\" DevicePath \"\"" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.185564 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlwb4\" (UniqueName: \"kubernetes.io/projected/e5cd53cf-b205-4c6c-92be-155def921e74-kube-api-access-jlwb4\") on node \"crc\" DevicePath \"\"" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.185573 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0151919-e58a-406d-939d-d88c8103e6f8-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.258738 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-9zft7" event={"ID":"d0151919-e58a-406d-939d-d88c8103e6f8","Type":"ContainerDied","Data":"013ba73f46b43155d8989e1496589efacfad762a3193ca35051d097c6558f997"} Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.258807 4687 scope.go:117] "RemoveContainer" containerID="dadfc09c302543ae8814d85143b168c1f40dcf4303954da228152a489574d0a8" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.258805 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-9zft7" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.260893 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-l2vmx" event={"ID":"e5cd53cf-b205-4c6c-92be-155def921e74","Type":"ContainerDied","Data":"3461489647dfeabdee2c2b6cc6f95bbf08c37ac56e129c63fa811ccfb12f10a6"} Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.261001 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-l2vmx" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.304418 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-9zft7"] Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.308102 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-9zft7"] Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.347956 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-l2vmx"] Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.354611 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-l2vmx"] Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.552365 4687 scope.go:117] "RemoveContainer" containerID="0de47c9b5d15a0a68f5b2639550d89c859455526bfc116eb9b9be7d6b7b5344c" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.669479 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0151919-e58a-406d-939d-d88c8103e6f8" path="/var/lib/kubelet/pods/d0151919-e58a-406d-939d-d88c8103e6f8/volumes" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.670094 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5cd53cf-b205-4c6c-92be-155def921e74" path="/var/lib/kubelet/pods/e5cd53cf-b205-4c6c-92be-155def921e74/volumes" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.752045 4687 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-csrrp"] Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.855494 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-lz77z"] Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.876572 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 28 09:18:42 crc kubenswrapper[4687]: E0228 09:18:42.876888 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5cd53cf-b205-4c6c-92be-155def921e74" containerName="init" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.876901 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5cd53cf-b205-4c6c-92be-155def921e74" containerName="init" Feb 28 09:18:42 crc kubenswrapper[4687]: E0228 09:18:42.876932 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0151919-e58a-406d-939d-d88c8103e6f8" containerName="init" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.876938 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0151919-e58a-406d-939d-d88c8103e6f8" containerName="init" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.877104 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5cd53cf-b205-4c6c-92be-155def921e74" containerName="init" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.877126 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0151919-e58a-406d-939d-d88c8103e6f8" containerName="init" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.877916 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.884094 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.884186 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.884322 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-nplqq" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.884403 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 28 09:18:42 crc kubenswrapper[4687]: I0228 09:18:42.891799 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 28 09:18:42 crc kubenswrapper[4687]: W0228 09:18:42.943660 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4f7bb81_e353_405c_9676_8a57d0886dae.slice/crio-619658dd4d99fbb3d8ff344d29da8c6bf42ed1d70c147560dfc5aaba380a4f72 WatchSource:0}: Error finding container 619658dd4d99fbb3d8ff344d29da8c6bf42ed1d70c147560dfc5aaba380a4f72: Status 404 returned error can't find the container with id 619658dd4d99fbb3d8ff344d29da8c6bf42ed1d70c147560dfc5aaba380a4f72 Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.011378 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dcb66eab-811b-4162-a74b-2fc36e9e51b5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"dcb66eab-811b-4162-a74b-2fc36e9e51b5\") " pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.011439 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/dcb66eab-811b-4162-a74b-2fc36e9e51b5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"dcb66eab-811b-4162-a74b-2fc36e9e51b5\") " pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.011478 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcb66eab-811b-4162-a74b-2fc36e9e51b5-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dcb66eab-811b-4162-a74b-2fc36e9e51b5\") " pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.011558 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcb66eab-811b-4162-a74b-2fc36e9e51b5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dcb66eab-811b-4162-a74b-2fc36e9e51b5\") " pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.011602 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb66eab-811b-4162-a74b-2fc36e9e51b5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"dcb66eab-811b-4162-a74b-2fc36e9e51b5\") " pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.011636 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dcb66eab-811b-4162-a74b-2fc36e9e51b5\") " pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.011746 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/dcb66eab-811b-4162-a74b-2fc36e9e51b5-config\") pod \"ovsdbserver-sb-0\" (UID: \"dcb66eab-811b-4162-a74b-2fc36e9e51b5\") " pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.011765 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szhv7\" (UniqueName: \"kubernetes.io/projected/dcb66eab-811b-4162-a74b-2fc36e9e51b5-kube-api-access-szhv7\") pod \"ovsdbserver-sb-0\" (UID: \"dcb66eab-811b-4162-a74b-2fc36e9e51b5\") " pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:43 crc kubenswrapper[4687]: W0228 09:18:43.082817 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podceb60d0e_2588_4d0d_abf1_afef4e684fc1.slice/crio-bdaacca8d54153aad67355fc4e5c1659dfc34f8e1e63794542ce3b9d26809ea6 WatchSource:0}: Error finding container bdaacca8d54153aad67355fc4e5c1659dfc34f8e1e63794542ce3b9d26809ea6: Status 404 returned error can't find the container with id bdaacca8d54153aad67355fc4e5c1659dfc34f8e1e63794542ce3b9d26809ea6 Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.113426 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcb66eab-811b-4162-a74b-2fc36e9e51b5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dcb66eab-811b-4162-a74b-2fc36e9e51b5\") " pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.113476 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb66eab-811b-4162-a74b-2fc36e9e51b5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"dcb66eab-811b-4162-a74b-2fc36e9e51b5\") " pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.113545 4687 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dcb66eab-811b-4162-a74b-2fc36e9e51b5\") " pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.113606 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcb66eab-811b-4162-a74b-2fc36e9e51b5-config\") pod \"ovsdbserver-sb-0\" (UID: \"dcb66eab-811b-4162-a74b-2fc36e9e51b5\") " pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.113627 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szhv7\" (UniqueName: \"kubernetes.io/projected/dcb66eab-811b-4162-a74b-2fc36e9e51b5-kube-api-access-szhv7\") pod \"ovsdbserver-sb-0\" (UID: \"dcb66eab-811b-4162-a74b-2fc36e9e51b5\") " pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.113653 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dcb66eab-811b-4162-a74b-2fc36e9e51b5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"dcb66eab-811b-4162-a74b-2fc36e9e51b5\") " pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.113702 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcb66eab-811b-4162-a74b-2fc36e9e51b5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"dcb66eab-811b-4162-a74b-2fc36e9e51b5\") " pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.113759 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcb66eab-811b-4162-a74b-2fc36e9e51b5-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"dcb66eab-811b-4162-a74b-2fc36e9e51b5\") " pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.113867 4687 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dcb66eab-811b-4162-a74b-2fc36e9e51b5\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.114570 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dcb66eab-811b-4162-a74b-2fc36e9e51b5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"dcb66eab-811b-4162-a74b-2fc36e9e51b5\") " pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.115334 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcb66eab-811b-4162-a74b-2fc36e9e51b5-config\") pod \"ovsdbserver-sb-0\" (UID: \"dcb66eab-811b-4162-a74b-2fc36e9e51b5\") " pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.115734 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcb66eab-811b-4162-a74b-2fc36e9e51b5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"dcb66eab-811b-4162-a74b-2fc36e9e51b5\") " pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.120796 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb66eab-811b-4162-a74b-2fc36e9e51b5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"dcb66eab-811b-4162-a74b-2fc36e9e51b5\") " pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.120960 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcb66eab-811b-4162-a74b-2fc36e9e51b5-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dcb66eab-811b-4162-a74b-2fc36e9e51b5\") " pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.125580 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcb66eab-811b-4162-a74b-2fc36e9e51b5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dcb66eab-811b-4162-a74b-2fc36e9e51b5\") " pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.131271 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dcb66eab-811b-4162-a74b-2fc36e9e51b5\") " pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.131796 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szhv7\" (UniqueName: \"kubernetes.io/projected/dcb66eab-811b-4162-a74b-2fc36e9e51b5-kube-api-access-szhv7\") pod \"ovsdbserver-sb-0\" (UID: \"dcb66eab-811b-4162-a74b-2fc36e9e51b5\") " pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.204858 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.271835 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-qzps4" event={"ID":"f6bffbc2-0283-4286-9d05-2b60186e0740","Type":"ContainerStarted","Data":"7c55efd320795420be7cc7c1d6573d8dd23409c9d315a7a2e7569ae939b2e393"} Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.271974 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78cb4465c9-qzps4" podUID="f6bffbc2-0283-4286-9d05-2b60186e0740" containerName="dnsmasq-dns" containerID="cri-o://7c55efd320795420be7cc7c1d6573d8dd23409c9d315a7a2e7569ae939b2e393" gracePeriod=10 Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.272052 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78cb4465c9-qzps4" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.276002 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-csrrp" event={"ID":"d4f7bb81-e353-405c-9676-8a57d0886dae","Type":"ContainerStarted","Data":"619658dd4d99fbb3d8ff344d29da8c6bf42ed1d70c147560dfc5aaba380a4f72"} Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.280920 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-2qbft" event={"ID":"f611fd7a-502d-4db5-ad7f-eae15ccd9486","Type":"ContainerStarted","Data":"84a856ce3dc0520d63c5594629f7656ad21f3e778d8f7266847a6f325e3404be"} Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.281064 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c47bcb9f9-2qbft" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.300555 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-lz77z" 
event={"ID":"ceb60d0e-2588-4d0d-abf1-afef4e684fc1","Type":"ContainerStarted","Data":"bdaacca8d54153aad67355fc4e5c1659dfc34f8e1e63794542ce3b9d26809ea6"} Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.308479 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78cb4465c9-qzps4" podStartSLOduration=7.7512842840000005 podStartE2EDuration="15.308464663s" podCreationTimestamp="2026-02-28 09:18:28 +0000 UTC" firstStartedPulling="2026-02-28 09:18:32.525692542 +0000 UTC m=+904.216261879" lastFinishedPulling="2026-02-28 09:18:40.082872921 +0000 UTC m=+911.773442258" observedRunningTime="2026-02-28 09:18:43.301811068 +0000 UTC m=+914.992380426" watchObservedRunningTime="2026-02-28 09:18:43.308464663 +0000 UTC m=+914.999033990" Feb 28 09:18:43 crc kubenswrapper[4687]: I0228 09:18:43.329323 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c47bcb9f9-2qbft" podStartSLOduration=13.821702248 podStartE2EDuration="14.32931506s" podCreationTimestamp="2026-02-28 09:18:29 +0000 UTC" firstStartedPulling="2026-02-28 09:18:39.901184849 +0000 UTC m=+911.591754187" lastFinishedPulling="2026-02-28 09:18:40.408797663 +0000 UTC m=+912.099366999" observedRunningTime="2026-02-28 09:18:43.327453728 +0000 UTC m=+915.018023085" watchObservedRunningTime="2026-02-28 09:18:43.32931506 +0000 UTC m=+915.019884397" Feb 28 09:18:44 crc kubenswrapper[4687]: I0228 09:18:44.313438 4687 generic.go:334] "Generic (PLEG): container finished" podID="f6bffbc2-0283-4286-9d05-2b60186e0740" containerID="7c55efd320795420be7cc7c1d6573d8dd23409c9d315a7a2e7569ae939b2e393" exitCode=0 Feb 28 09:18:44 crc kubenswrapper[4687]: I0228 09:18:44.313525 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-qzps4" event={"ID":"f6bffbc2-0283-4286-9d05-2b60186e0740","Type":"ContainerDied","Data":"7c55efd320795420be7cc7c1d6573d8dd23409c9d315a7a2e7569ae939b2e393"} Feb 28 09:18:46 crc 
kubenswrapper[4687]: I0228 09:18:46.630445 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-qzps4" Feb 28 09:18:46 crc kubenswrapper[4687]: I0228 09:18:46.688098 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6bffbc2-0283-4286-9d05-2b60186e0740-config\") pod \"f6bffbc2-0283-4286-9d05-2b60186e0740\" (UID: \"f6bffbc2-0283-4286-9d05-2b60186e0740\") " Feb 28 09:18:46 crc kubenswrapper[4687]: I0228 09:18:46.688253 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6bffbc2-0283-4286-9d05-2b60186e0740-dns-svc\") pod \"f6bffbc2-0283-4286-9d05-2b60186e0740\" (UID: \"f6bffbc2-0283-4286-9d05-2b60186e0740\") " Feb 28 09:18:46 crc kubenswrapper[4687]: I0228 09:18:46.688394 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr95z\" (UniqueName: \"kubernetes.io/projected/f6bffbc2-0283-4286-9d05-2b60186e0740-kube-api-access-zr95z\") pod \"f6bffbc2-0283-4286-9d05-2b60186e0740\" (UID: \"f6bffbc2-0283-4286-9d05-2b60186e0740\") " Feb 28 09:18:46 crc kubenswrapper[4687]: I0228 09:18:46.694682 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6bffbc2-0283-4286-9d05-2b60186e0740-kube-api-access-zr95z" (OuterVolumeSpecName: "kube-api-access-zr95z") pod "f6bffbc2-0283-4286-9d05-2b60186e0740" (UID: "f6bffbc2-0283-4286-9d05-2b60186e0740"). InnerVolumeSpecName "kube-api-access-zr95z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:18:46 crc kubenswrapper[4687]: I0228 09:18:46.722401 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6bffbc2-0283-4286-9d05-2b60186e0740-config" (OuterVolumeSpecName: "config") pod "f6bffbc2-0283-4286-9d05-2b60186e0740" (UID: "f6bffbc2-0283-4286-9d05-2b60186e0740"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:18:46 crc kubenswrapper[4687]: I0228 09:18:46.733774 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6bffbc2-0283-4286-9d05-2b60186e0740-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f6bffbc2-0283-4286-9d05-2b60186e0740" (UID: "f6bffbc2-0283-4286-9d05-2b60186e0740"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:18:46 crc kubenswrapper[4687]: I0228 09:18:46.791307 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr95z\" (UniqueName: \"kubernetes.io/projected/f6bffbc2-0283-4286-9d05-2b60186e0740-kube-api-access-zr95z\") on node \"crc\" DevicePath \"\"" Feb 28 09:18:46 crc kubenswrapper[4687]: I0228 09:18:46.791341 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6bffbc2-0283-4286-9d05-2b60186e0740-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:18:46 crc kubenswrapper[4687]: I0228 09:18:46.791355 4687 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6bffbc2-0283-4286-9d05-2b60186e0740-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 28 09:18:47 crc kubenswrapper[4687]: I0228 09:18:47.337634 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-qzps4" event={"ID":"f6bffbc2-0283-4286-9d05-2b60186e0740","Type":"ContainerDied","Data":"1492ced404b22343cd4f039c88b82fddbbcf9fe9b50f77c35c8debfb4dc0b97d"} Feb 
28 09:18:47 crc kubenswrapper[4687]: I0228 09:18:47.337692 4687 scope.go:117] "RemoveContainer" containerID="7c55efd320795420be7cc7c1d6573d8dd23409c9d315a7a2e7569ae939b2e393" Feb 28 09:18:47 crc kubenswrapper[4687]: I0228 09:18:47.337704 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-qzps4" Feb 28 09:18:47 crc kubenswrapper[4687]: I0228 09:18:47.363946 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-qzps4"] Feb 28 09:18:47 crc kubenswrapper[4687]: I0228 09:18:47.369612 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-qzps4"] Feb 28 09:18:48 crc kubenswrapper[4687]: I0228 09:18:48.672507 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6bffbc2-0283-4286-9d05-2b60186e0740" path="/var/lib/kubelet/pods/f6bffbc2-0283-4286-9d05-2b60186e0740/volumes" Feb 28 09:18:49 crc kubenswrapper[4687]: I0228 09:18:49.436406 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c47bcb9f9-2qbft" Feb 28 09:18:49 crc kubenswrapper[4687]: I0228 09:18:49.964048 4687 scope.go:117] "RemoveContainer" containerID="46d18a03e96e176491ee7e5b14e4ec9e06b5e18d5b539ce140fd4fa0796a92dd" Feb 28 09:18:50 crc kubenswrapper[4687]: I0228 09:18:50.380523 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-lz77z" event={"ID":"ceb60d0e-2588-4d0d-abf1-afef4e684fc1","Type":"ContainerStarted","Data":"74fd66f7d80ae36c20736b4d0bee519cc20c9029c028818f66d48b58ddadf72a"} Feb 28 09:18:50 crc kubenswrapper[4687]: I0228 09:18:50.435467 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 28 09:18:51 crc kubenswrapper[4687]: I0228 09:18:51.399249 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"48796fdd-f9c8-473a-b17f-c6da6d0ba3a5","Type":"ContainerStarted","Data":"24793ec00ab9dc25be7c154fbd4aa7a30703ed95bcfdc6d44235a3036c49ffc6"} Feb 28 09:18:51 crc kubenswrapper[4687]: I0228 09:18:51.399908 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 28 09:18:51 crc kubenswrapper[4687]: I0228 09:18:51.401037 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c1fac181-ae33-45e1-8171-1d998d59bc04","Type":"ContainerStarted","Data":"c0a63dcd1e882d2c876c3093736b678b535b0514101340a81af2ec6e3c1d7c0d"} Feb 28 09:18:51 crc kubenswrapper[4687]: I0228 09:18:51.403579 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-grkmn" event={"ID":"b7837572-8dcc-409d-b8fd-c37f2af52474","Type":"ContainerStarted","Data":"afdb6a11441a1738dc0bf9a4e10a44c4ed0c39d599b7cac692389cae0b893d09"} Feb 28 09:18:51 crc kubenswrapper[4687]: I0228 09:18:51.404220 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-grkmn" Feb 28 09:18:51 crc kubenswrapper[4687]: I0228 09:18:51.405440 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a75c27c0-aef6-4631-9a63-521ba7e5889c","Type":"ContainerStarted","Data":"61f947a196578b0493138114a8386b30076ee899bcabd523245d293cd935fea9"} Feb 28 09:18:51 crc kubenswrapper[4687]: I0228 09:18:51.405920 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 28 09:18:51 crc kubenswrapper[4687]: I0228 09:18:51.407639 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"dcb66eab-811b-4162-a74b-2fc36e9e51b5","Type":"ContainerStarted","Data":"07c6a09cdae3df952aab010190b1369f3e2563effb382337652865d2ddd1c4e0"} Feb 28 09:18:51 crc kubenswrapper[4687]: I0228 09:18:51.409333 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-cell1-galera-0" event={"ID":"d1fe0178-db8f-44e3-9e53-a2450914080a","Type":"ContainerStarted","Data":"4e4fab46e900e5dec9d0b762622b5bcdb925ec4e87eaa11e25ca363045a5a4e6"} Feb 28 09:18:51 crc kubenswrapper[4687]: I0228 09:18:51.414172 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kbhr4" event={"ID":"ce17423e-ccd3-4aad-9538-2424a822d5df","Type":"ContainerStarted","Data":"5b62fb436068dec943f609e9c6a779ac2e458b2748b0a0f02df3f4d1fa860a89"} Feb 28 09:18:51 crc kubenswrapper[4687]: I0228 09:18:51.415607 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-csrrp" event={"ID":"d4f7bb81-e353-405c-9676-8a57d0886dae","Type":"ContainerStarted","Data":"698eda92874d70dbf2b0f7347fd25d59dc9335aab689db08959f544ad65494db"} Feb 28 09:18:51 crc kubenswrapper[4687]: I0228 09:18:51.418016 4687 generic.go:334] "Generic (PLEG): container finished" podID="ceb60d0e-2588-4d0d-abf1-afef4e684fc1" containerID="74fd66f7d80ae36c20736b4d0bee519cc20c9029c028818f66d48b58ddadf72a" exitCode=0 Feb 28 09:18:51 crc kubenswrapper[4687]: I0228 09:18:51.418084 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-lz77z" event={"ID":"ceb60d0e-2588-4d0d-abf1-afef4e684fc1","Type":"ContainerDied","Data":"74fd66f7d80ae36c20736b4d0bee519cc20c9029c028818f66d48b58ddadf72a"} Feb 28 09:18:51 crc kubenswrapper[4687]: I0228 09:18:51.439524 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=10.357777676 podStartE2EDuration="19.43950735s" podCreationTimestamp="2026-02-28 09:18:32 +0000 UTC" firstStartedPulling="2026-02-28 09:18:40.443238391 +0000 UTC m=+912.133807728" lastFinishedPulling="2026-02-28 09:18:49.524968065 +0000 UTC m=+921.215537402" observedRunningTime="2026-02-28 09:18:51.434692915 +0000 UTC m=+923.125262251" watchObservedRunningTime="2026-02-28 09:18:51.43950735 +0000 UTC m=+923.130076687" Feb 28 
09:18:51 crc kubenswrapper[4687]: I0228 09:18:51.571852 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=7.201779184 podStartE2EDuration="16.571832125s" podCreationTimestamp="2026-02-28 09:18:35 +0000 UTC" firstStartedPulling="2026-02-28 09:18:40.58359376 +0000 UTC m=+912.274163098" lastFinishedPulling="2026-02-28 09:18:49.953646703 +0000 UTC m=+921.644216039" observedRunningTime="2026-02-28 09:18:51.553328562 +0000 UTC m=+923.243897909" watchObservedRunningTime="2026-02-28 09:18:51.571832125 +0000 UTC m=+923.262401462" Feb 28 09:18:51 crc kubenswrapper[4687]: I0228 09:18:51.574760 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-grkmn" podStartSLOduration=4.141616557 podStartE2EDuration="13.574746698s" podCreationTimestamp="2026-02-28 09:18:38 +0000 UTC" firstStartedPulling="2026-02-28 09:18:40.545846483 +0000 UTC m=+912.236415820" lastFinishedPulling="2026-02-28 09:18:49.978976624 +0000 UTC m=+921.669545961" observedRunningTime="2026-02-28 09:18:51.570541719 +0000 UTC m=+923.261111056" watchObservedRunningTime="2026-02-28 09:18:51.574746698 +0000 UTC m=+923.265316034" Feb 28 09:18:51 crc kubenswrapper[4687]: I0228 09:18:51.586317 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-csrrp" podStartSLOduration=3.188279809 podStartE2EDuration="10.586299086s" podCreationTimestamp="2026-02-28 09:18:41 +0000 UTC" firstStartedPulling="2026-02-28 09:18:42.947465932 +0000 UTC m=+914.638035269" lastFinishedPulling="2026-02-28 09:18:50.345485209 +0000 UTC m=+922.036054546" observedRunningTime="2026-02-28 09:18:51.580324618 +0000 UTC m=+923.270893955" watchObservedRunningTime="2026-02-28 09:18:51.586299086 +0000 UTC m=+923.276868423" Feb 28 09:18:51 crc kubenswrapper[4687]: I0228 09:18:51.899387 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-6444958b7f-lz77z"] Feb 28 09:18:51 crc kubenswrapper[4687]: I0228 09:18:51.955219 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-l4xdl"] Feb 28 09:18:51 crc kubenswrapper[4687]: E0228 09:18:51.955583 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6bffbc2-0283-4286-9d05-2b60186e0740" containerName="init" Feb 28 09:18:51 crc kubenswrapper[4687]: I0228 09:18:51.955600 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6bffbc2-0283-4286-9d05-2b60186e0740" containerName="init" Feb 28 09:18:51 crc kubenswrapper[4687]: E0228 09:18:51.955619 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6bffbc2-0283-4286-9d05-2b60186e0740" containerName="dnsmasq-dns" Feb 28 09:18:51 crc kubenswrapper[4687]: I0228 09:18:51.955625 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6bffbc2-0283-4286-9d05-2b60186e0740" containerName="dnsmasq-dns" Feb 28 09:18:51 crc kubenswrapper[4687]: I0228 09:18:51.955788 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6bffbc2-0283-4286-9d05-2b60186e0740" containerName="dnsmasq-dns" Feb 28 09:18:51 crc kubenswrapper[4687]: I0228 09:18:51.963938 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" Feb 28 09:18:51 crc kubenswrapper[4687]: I0228 09:18:51.967064 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 28 09:18:51 crc kubenswrapper[4687]: I0228 09:18:51.980941 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-l4xdl"] Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.007058 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-l4xdl\" (UID: \"53ebc6fa-0ccc-410b-9d22-5e5e978da47b\") " pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.007290 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-l4xdl\" (UID: \"53ebc6fa-0ccc-410b-9d22-5e5e978da47b\") " pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.007342 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-config\") pod \"dnsmasq-dns-7b57d9888c-l4xdl\" (UID: \"53ebc6fa-0ccc-410b-9d22-5e5e978da47b\") " pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.007453 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-l4xdl\" (UID: \"53ebc6fa-0ccc-410b-9d22-5e5e978da47b\") " pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" 
Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.007578 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84hqq\" (UniqueName: \"kubernetes.io/projected/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-kube-api-access-84hqq\") pod \"dnsmasq-dns-7b57d9888c-l4xdl\" (UID: \"53ebc6fa-0ccc-410b-9d22-5e5e978da47b\") " pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.109769 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-l4xdl\" (UID: \"53ebc6fa-0ccc-410b-9d22-5e5e978da47b\") " pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.109853 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84hqq\" (UniqueName: \"kubernetes.io/projected/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-kube-api-access-84hqq\") pod \"dnsmasq-dns-7b57d9888c-l4xdl\" (UID: \"53ebc6fa-0ccc-410b-9d22-5e5e978da47b\") " pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.109938 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-l4xdl\" (UID: \"53ebc6fa-0ccc-410b-9d22-5e5e978da47b\") " pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.109969 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-l4xdl\" (UID: \"53ebc6fa-0ccc-410b-9d22-5e5e978da47b\") " pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" Feb 28 09:18:52 crc 
kubenswrapper[4687]: I0228 09:18:52.110010 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-config\") pod \"dnsmasq-dns-7b57d9888c-l4xdl\" (UID: \"53ebc6fa-0ccc-410b-9d22-5e5e978da47b\") " pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.110844 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-config\") pod \"dnsmasq-dns-7b57d9888c-l4xdl\" (UID: \"53ebc6fa-0ccc-410b-9d22-5e5e978da47b\") " pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.111931 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-l4xdl\" (UID: \"53ebc6fa-0ccc-410b-9d22-5e5e978da47b\") " pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.112694 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-l4xdl\" (UID: \"53ebc6fa-0ccc-410b-9d22-5e5e978da47b\") " pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.113783 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-l4xdl\" (UID: \"53ebc6fa-0ccc-410b-9d22-5e5e978da47b\") " pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.148641 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-84hqq\" (UniqueName: \"kubernetes.io/projected/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-kube-api-access-84hqq\") pod \"dnsmasq-dns-7b57d9888c-l4xdl\" (UID: \"53ebc6fa-0ccc-410b-9d22-5e5e978da47b\") " pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.313959 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.440625 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695","Type":"ContainerStarted","Data":"fe033a935c10d57ecf5d1691359c030bfe451fa954eff31a9deb6145917d06d7"} Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.440913 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695","Type":"ContainerStarted","Data":"224c7554764bafa6b02467491788b4c9fe8884ae56fcb463c0fcd4da9f93d9af"} Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.444203 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"541f5799-4b5e-4767-aca7-8c3738502a06","Type":"ContainerStarted","Data":"fc6036d26129118d267b8cba85e86a89e8d8f4544e9f0c7c8c7911aa86fdebc9"} Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.446295 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-lz77z" event={"ID":"ceb60d0e-2588-4d0d-abf1-afef4e684fc1","Type":"ContainerStarted","Data":"e9c0b3b1c1336317f7565006a1224755bc86cb90aa47b9233738c9bc40b8d339"} Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.446412 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6444958b7f-lz77z" podUID="ceb60d0e-2588-4d0d-abf1-afef4e684fc1" containerName="dnsmasq-dns" 
containerID="cri-o://e9c0b3b1c1336317f7565006a1224755bc86cb90aa47b9233738c9bc40b8d339" gracePeriod=10 Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.446424 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6444958b7f-lz77z" Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.451409 4687 generic.go:334] "Generic (PLEG): container finished" podID="ce17423e-ccd3-4aad-9538-2424a822d5df" containerID="5b62fb436068dec943f609e9c6a779ac2e458b2748b0a0f02df3f4d1fa860a89" exitCode=0 Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.451634 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kbhr4" event={"ID":"ce17423e-ccd3-4aad-9538-2424a822d5df","Type":"ContainerDied","Data":"5b62fb436068dec943f609e9c6a779ac2e458b2748b0a0f02df3f4d1fa860a89"} Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.453835 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"171eb8fe-deaf-4936-b51d-de02b4131b8b","Type":"ContainerStarted","Data":"6c7e5035e6c7381269e50141c66991933c97603d3e9469d3c92f79c4e27e4068"} Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.466627 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=5.797970752 podStartE2EDuration="15.466605576s" podCreationTimestamp="2026-02-28 09:18:37 +0000 UTC" firstStartedPulling="2026-02-28 09:18:40.643738325 +0000 UTC m=+912.334307662" lastFinishedPulling="2026-02-28 09:18:50.312373149 +0000 UTC m=+922.002942486" observedRunningTime="2026-02-28 09:18:52.463682428 +0000 UTC m=+924.154251764" watchObservedRunningTime="2026-02-28 09:18:52.466605576 +0000 UTC m=+924.157174913" Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.530556 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6444958b7f-lz77z" podStartSLOduration=11.530536952 
podStartE2EDuration="11.530536952s" podCreationTimestamp="2026-02-28 09:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:18:52.516164439 +0000 UTC m=+924.206733776" watchObservedRunningTime="2026-02-28 09:18:52.530536952 +0000 UTC m=+924.221106289" Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.793031 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-l4xdl"] Feb 28 09:18:52 crc kubenswrapper[4687]: I0228 09:18:52.925346 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-lz77z" Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.025791 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ceb60d0e-2588-4d0d-abf1-afef4e684fc1-ovsdbserver-nb\") pod \"ceb60d0e-2588-4d0d-abf1-afef4e684fc1\" (UID: \"ceb60d0e-2588-4d0d-abf1-afef4e684fc1\") " Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.025874 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r2k9\" (UniqueName: \"kubernetes.io/projected/ceb60d0e-2588-4d0d-abf1-afef4e684fc1-kube-api-access-7r2k9\") pod \"ceb60d0e-2588-4d0d-abf1-afef4e684fc1\" (UID: \"ceb60d0e-2588-4d0d-abf1-afef4e684fc1\") " Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.026125 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ceb60d0e-2588-4d0d-abf1-afef4e684fc1-dns-svc\") pod \"ceb60d0e-2588-4d0d-abf1-afef4e684fc1\" (UID: \"ceb60d0e-2588-4d0d-abf1-afef4e684fc1\") " Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.026202 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ceb60d0e-2588-4d0d-abf1-afef4e684fc1-config\") pod \"ceb60d0e-2588-4d0d-abf1-afef4e684fc1\" (UID: \"ceb60d0e-2588-4d0d-abf1-afef4e684fc1\") " Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.029331 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceb60d0e-2588-4d0d-abf1-afef4e684fc1-kube-api-access-7r2k9" (OuterVolumeSpecName: "kube-api-access-7r2k9") pod "ceb60d0e-2588-4d0d-abf1-afef4e684fc1" (UID: "ceb60d0e-2588-4d0d-abf1-afef4e684fc1"). InnerVolumeSpecName "kube-api-access-7r2k9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.053948 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceb60d0e-2588-4d0d-abf1-afef4e684fc1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ceb60d0e-2588-4d0d-abf1-afef4e684fc1" (UID: "ceb60d0e-2588-4d0d-abf1-afef4e684fc1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.055924 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceb60d0e-2588-4d0d-abf1-afef4e684fc1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ceb60d0e-2588-4d0d-abf1-afef4e684fc1" (UID: "ceb60d0e-2588-4d0d-abf1-afef4e684fc1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.056528 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceb60d0e-2588-4d0d-abf1-afef4e684fc1-config" (OuterVolumeSpecName: "config") pod "ceb60d0e-2588-4d0d-abf1-afef4e684fc1" (UID: "ceb60d0e-2588-4d0d-abf1-afef4e684fc1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.128985 4687 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ceb60d0e-2588-4d0d-abf1-afef4e684fc1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.129042 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7r2k9\" (UniqueName: \"kubernetes.io/projected/ceb60d0e-2588-4d0d-abf1-afef4e684fc1-kube-api-access-7r2k9\") on node \"crc\" DevicePath \"\"" Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.129055 4687 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ceb60d0e-2588-4d0d-abf1-afef4e684fc1-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.129068 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceb60d0e-2588-4d0d-abf1-afef4e684fc1-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.465036 4687 generic.go:334] "Generic (PLEG): container finished" podID="ceb60d0e-2588-4d0d-abf1-afef4e684fc1" containerID="e9c0b3b1c1336317f7565006a1224755bc86cb90aa47b9233738c9bc40b8d339" exitCode=0 Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.465098 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-lz77z" event={"ID":"ceb60d0e-2588-4d0d-abf1-afef4e684fc1","Type":"ContainerDied","Data":"e9c0b3b1c1336317f7565006a1224755bc86cb90aa47b9233738c9bc40b8d339"} Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.465145 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-lz77z" Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.466262 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-lz77z" event={"ID":"ceb60d0e-2588-4d0d-abf1-afef4e684fc1","Type":"ContainerDied","Data":"bdaacca8d54153aad67355fc4e5c1659dfc34f8e1e63794542ce3b9d26809ea6"} Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.466313 4687 scope.go:117] "RemoveContainer" containerID="e9c0b3b1c1336317f7565006a1224755bc86cb90aa47b9233738c9bc40b8d339" Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.469233 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"dcb66eab-811b-4162-a74b-2fc36e9e51b5","Type":"ContainerStarted","Data":"4c8fc81babba91449e4013257987a17ddada38b3f0c289ae916dcfe42f818613"} Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.469367 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"dcb66eab-811b-4162-a74b-2fc36e9e51b5","Type":"ContainerStarted","Data":"de25da88b9943744c5094906320564019fafab5f150dc0c65e07f8a7ffcb8a92"} Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.471783 4687 generic.go:334] "Generic (PLEG): container finished" podID="53ebc6fa-0ccc-410b-9d22-5e5e978da47b" containerID="d330cc5ff78d6b76c11114998bdbbde08f6b247cbcfc691d665c4e68f74d9cca" exitCode=0 Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.471874 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" event={"ID":"53ebc6fa-0ccc-410b-9d22-5e5e978da47b","Type":"ContainerDied","Data":"d330cc5ff78d6b76c11114998bdbbde08f6b247cbcfc691d665c4e68f74d9cca"} Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.471903 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" 
event={"ID":"53ebc6fa-0ccc-410b-9d22-5e5e978da47b","Type":"ContainerStarted","Data":"5cb040e7890419e9bf3204e7c7fdad2816334174755a60c874d1182b656d1b6f"} Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.480267 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kbhr4" event={"ID":"ce17423e-ccd3-4aad-9538-2424a822d5df","Type":"ContainerStarted","Data":"a859b28eda0213186697c93053f0e5ef1ed80b3b54346fedb6f2d14dffe16779"} Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.480326 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kbhr4" event={"ID":"ce17423e-ccd3-4aad-9538-2424a822d5df","Type":"ContainerStarted","Data":"11644decce4a979d96005ca2f0835a929b2ca4752ea799e219ea3c337740b0fe"} Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.481254 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-kbhr4" Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.484178 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-kbhr4" Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.487098 4687 scope.go:117] "RemoveContainer" containerID="74fd66f7d80ae36c20736b4d0bee519cc20c9029c028818f66d48b58ddadf72a" Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.506699 4687 scope.go:117] "RemoveContainer" containerID="e9c0b3b1c1336317f7565006a1224755bc86cb90aa47b9233738c9bc40b8d339" Feb 28 09:18:53 crc kubenswrapper[4687]: E0228 09:18:53.507240 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9c0b3b1c1336317f7565006a1224755bc86cb90aa47b9233738c9bc40b8d339\": container with ID starting with e9c0b3b1c1336317f7565006a1224755bc86cb90aa47b9233738c9bc40b8d339 not found: ID does not exist" containerID="e9c0b3b1c1336317f7565006a1224755bc86cb90aa47b9233738c9bc40b8d339" Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 
09:18:53.507294 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9c0b3b1c1336317f7565006a1224755bc86cb90aa47b9233738c9bc40b8d339"} err="failed to get container status \"e9c0b3b1c1336317f7565006a1224755bc86cb90aa47b9233738c9bc40b8d339\": rpc error: code = NotFound desc = could not find container \"e9c0b3b1c1336317f7565006a1224755bc86cb90aa47b9233738c9bc40b8d339\": container with ID starting with e9c0b3b1c1336317f7565006a1224755bc86cb90aa47b9233738c9bc40b8d339 not found: ID does not exist" Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.507330 4687 scope.go:117] "RemoveContainer" containerID="74fd66f7d80ae36c20736b4d0bee519cc20c9029c028818f66d48b58ddadf72a" Feb 28 09:18:53 crc kubenswrapper[4687]: E0228 09:18:53.507714 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74fd66f7d80ae36c20736b4d0bee519cc20c9029c028818f66d48b58ddadf72a\": container with ID starting with 74fd66f7d80ae36c20736b4d0bee519cc20c9029c028818f66d48b58ddadf72a not found: ID does not exist" containerID="74fd66f7d80ae36c20736b4d0bee519cc20c9029c028818f66d48b58ddadf72a" Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.507767 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74fd66f7d80ae36c20736b4d0bee519cc20c9029c028818f66d48b58ddadf72a"} err="failed to get container status \"74fd66f7d80ae36c20736b4d0bee519cc20c9029c028818f66d48b58ddadf72a\": rpc error: code = NotFound desc = could not find container \"74fd66f7d80ae36c20736b4d0bee519cc20c9029c028818f66d48b58ddadf72a\": container with ID starting with 74fd66f7d80ae36c20736b4d0bee519cc20c9029c028818f66d48b58ddadf72a not found: ID does not exist" Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.508866 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=10.586009842 
podStartE2EDuration="12.508849048s" podCreationTimestamp="2026-02-28 09:18:41 +0000 UTC" firstStartedPulling="2026-02-28 09:18:50.469248471 +0000 UTC m=+922.159817808" lastFinishedPulling="2026-02-28 09:18:52.392087677 +0000 UTC m=+924.082657014" observedRunningTime="2026-02-28 09:18:53.494280236 +0000 UTC m=+925.184849573" watchObservedRunningTime="2026-02-28 09:18:53.508849048 +0000 UTC m=+925.199418385" Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.540485 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-lz77z"] Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.546977 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-lz77z"] Feb 28 09:18:53 crc kubenswrapper[4687]: I0228 09:18:53.557055 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-kbhr4" podStartSLOduration=6.5690678909999995 podStartE2EDuration="15.557036301s" podCreationTimestamp="2026-02-28 09:18:38 +0000 UTC" firstStartedPulling="2026-02-28 09:18:40.535917267 +0000 UTC m=+912.226486605" lastFinishedPulling="2026-02-28 09:18:49.523885678 +0000 UTC m=+921.214455015" observedRunningTime="2026-02-28 09:18:53.546740086 +0000 UTC m=+925.237309433" watchObservedRunningTime="2026-02-28 09:18:53.557036301 +0000 UTC m=+925.247605629" Feb 28 09:18:54 crc kubenswrapper[4687]: I0228 09:18:54.109401 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:54 crc kubenswrapper[4687]: I0228 09:18:54.109838 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:54 crc kubenswrapper[4687]: I0228 09:18:54.141229 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:54 crc kubenswrapper[4687]: I0228 09:18:54.488804 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" event={"ID":"53ebc6fa-0ccc-410b-9d22-5e5e978da47b","Type":"ContainerStarted","Data":"31810d42d60fae982297517bec5917f0eb72aab74c9b3c578aa48da2259ccf7a"} Feb 28 09:18:54 crc kubenswrapper[4687]: I0228 09:18:54.489041 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" Feb 28 09:18:54 crc kubenswrapper[4687]: I0228 09:18:54.512480 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" podStartSLOduration=3.512458204 podStartE2EDuration="3.512458204s" podCreationTimestamp="2026-02-28 09:18:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:18:54.508676011 +0000 UTC m=+926.199245348" watchObservedRunningTime="2026-02-28 09:18:54.512458204 +0000 UTC m=+926.203027542" Feb 28 09:18:54 crc kubenswrapper[4687]: I0228 09:18:54.667640 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceb60d0e-2588-4d0d-abf1-afef4e684fc1" path="/var/lib/kubelet/pods/ceb60d0e-2588-4d0d-abf1-afef4e684fc1/volumes" Feb 28 09:18:55 crc kubenswrapper[4687]: I0228 09:18:55.002120 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:18:55 crc kubenswrapper[4687]: I0228 09:18:55.002483 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:18:55 crc kubenswrapper[4687]: I0228 
09:18:55.206006 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:55 crc kubenswrapper[4687]: I0228 09:18:55.240175 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:55 crc kubenswrapper[4687]: I0228 09:18:55.396731 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 28 09:18:55 crc kubenswrapper[4687]: I0228 09:18:55.500902 4687 generic.go:334] "Generic (PLEG): container finished" podID="d1fe0178-db8f-44e3-9e53-a2450914080a" containerID="4e4fab46e900e5dec9d0b762622b5bcdb925ec4e87eaa11e25ca363045a5a4e6" exitCode=0 Feb 28 09:18:55 crc kubenswrapper[4687]: I0228 09:18:55.501057 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d1fe0178-db8f-44e3-9e53-a2450914080a","Type":"ContainerDied","Data":"4e4fab46e900e5dec9d0b762622b5bcdb925ec4e87eaa11e25ca363045a5a4e6"} Feb 28 09:18:55 crc kubenswrapper[4687]: I0228 09:18:55.504937 4687 generic.go:334] "Generic (PLEG): container finished" podID="c1fac181-ae33-45e1-8171-1d998d59bc04" containerID="c0a63dcd1e882d2c876c3093736b678b535b0514101340a81af2ec6e3c1d7c0d" exitCode=0 Feb 28 09:18:55 crc kubenswrapper[4687]: I0228 09:18:55.505094 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c1fac181-ae33-45e1-8171-1d998d59bc04","Type":"ContainerDied","Data":"c0a63dcd1e882d2c876c3093736b678b535b0514101340a81af2ec6e3c1d7c0d"} Feb 28 09:18:55 crc kubenswrapper[4687]: I0228 09:18:55.505710 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:56 crc kubenswrapper[4687]: I0228 09:18:56.515495 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"d1fe0178-db8f-44e3-9e53-a2450914080a","Type":"ContainerStarted","Data":"4195560720ae5edbc58cb41a5a049dc0ff38ab543f07c8e7b7b877786237f694"} Feb 28 09:18:56 crc kubenswrapper[4687]: I0228 09:18:56.518092 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c1fac181-ae33-45e1-8171-1d998d59bc04","Type":"ContainerStarted","Data":"c5e06658579482834fdbcc6f4df465e78a05881ee15819206e174972effd99ed"} Feb 28 09:18:56 crc kubenswrapper[4687]: I0228 09:18:56.544177 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=15.702703988 podStartE2EDuration="25.544157051s" podCreationTimestamp="2026-02-28 09:18:31 +0000 UTC" firstStartedPulling="2026-02-28 09:18:40.444295459 +0000 UTC m=+912.134864796" lastFinishedPulling="2026-02-28 09:18:50.285748522 +0000 UTC m=+921.976317859" observedRunningTime="2026-02-28 09:18:56.537343326 +0000 UTC m=+928.227912662" watchObservedRunningTime="2026-02-28 09:18:56.544157051 +0000 UTC m=+928.234726388" Feb 28 09:18:56 crc kubenswrapper[4687]: I0228 09:18:56.558694 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=16.249712702 podStartE2EDuration="26.558674376s" podCreationTimestamp="2026-02-28 09:18:30 +0000 UTC" firstStartedPulling="2026-02-28 09:18:39.215934505 +0000 UTC m=+910.906503842" lastFinishedPulling="2026-02-28 09:18:49.524896179 +0000 UTC m=+921.215465516" observedRunningTime="2026-02-28 09:18:56.553907581 +0000 UTC m=+928.244476918" watchObservedRunningTime="2026-02-28 09:18:56.558674376 +0000 UTC m=+928.249243713" Feb 28 09:18:57 crc kubenswrapper[4687]: I0228 09:18:57.559570 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 28 09:18:58 crc kubenswrapper[4687]: I0228 09:18:58.269206 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/memcached-0" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.136129 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.344185 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 28 09:18:59 crc kubenswrapper[4687]: E0228 09:18:59.344533 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceb60d0e-2588-4d0d-abf1-afef4e684fc1" containerName="dnsmasq-dns" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.344553 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb60d0e-2588-4d0d-abf1-afef4e684fc1" containerName="dnsmasq-dns" Feb 28 09:18:59 crc kubenswrapper[4687]: E0228 09:18:59.344572 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceb60d0e-2588-4d0d-abf1-afef4e684fc1" containerName="init" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.344580 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb60d0e-2588-4d0d-abf1-afef4e684fc1" containerName="init" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.344744 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceb60d0e-2588-4d0d-abf1-afef4e684fc1" containerName="dnsmasq-dns" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.345567 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.347160 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-s27ns" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.347182 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.348047 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.350412 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.354970 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.445408 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99f1dc2d-f77e-447b-836c-d485426a72c2-scripts\") pod \"ovn-northd-0\" (UID: \"99f1dc2d-f77e-447b-836c-d485426a72c2\") " pod="openstack/ovn-northd-0" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.445452 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99f1dc2d-f77e-447b-836c-d485426a72c2-config\") pod \"ovn-northd-0\" (UID: \"99f1dc2d-f77e-447b-836c-d485426a72c2\") " pod="openstack/ovn-northd-0" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.445504 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4lxk\" (UniqueName: \"kubernetes.io/projected/99f1dc2d-f77e-447b-836c-d485426a72c2-kube-api-access-r4lxk\") pod \"ovn-northd-0\" (UID: \"99f1dc2d-f77e-447b-836c-d485426a72c2\") " pod="openstack/ovn-northd-0" Feb 28 
09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.445564 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f1dc2d-f77e-447b-836c-d485426a72c2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"99f1dc2d-f77e-447b-836c-d485426a72c2\") " pod="openstack/ovn-northd-0" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.445596 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f1dc2d-f77e-447b-836c-d485426a72c2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"99f1dc2d-f77e-447b-836c-d485426a72c2\") " pod="openstack/ovn-northd-0" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.445635 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f1dc2d-f77e-447b-836c-d485426a72c2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"99f1dc2d-f77e-447b-836c-d485426a72c2\") " pod="openstack/ovn-northd-0" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.445668 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/99f1dc2d-f77e-447b-836c-d485426a72c2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"99f1dc2d-f77e-447b-836c-d485426a72c2\") " pod="openstack/ovn-northd-0" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.547873 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f1dc2d-f77e-447b-836c-d485426a72c2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"99f1dc2d-f77e-447b-836c-d485426a72c2\") " pod="openstack/ovn-northd-0" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.547989 4687 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f1dc2d-f77e-447b-836c-d485426a72c2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"99f1dc2d-f77e-447b-836c-d485426a72c2\") " pod="openstack/ovn-northd-0" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.549095 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/99f1dc2d-f77e-447b-836c-d485426a72c2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"99f1dc2d-f77e-447b-836c-d485426a72c2\") " pod="openstack/ovn-northd-0" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.549281 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99f1dc2d-f77e-447b-836c-d485426a72c2-config\") pod \"ovn-northd-0\" (UID: \"99f1dc2d-f77e-447b-836c-d485426a72c2\") " pod="openstack/ovn-northd-0" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.549318 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99f1dc2d-f77e-447b-836c-d485426a72c2-scripts\") pod \"ovn-northd-0\" (UID: \"99f1dc2d-f77e-447b-836c-d485426a72c2\") " pod="openstack/ovn-northd-0" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.549443 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4lxk\" (UniqueName: \"kubernetes.io/projected/99f1dc2d-f77e-447b-836c-d485426a72c2-kube-api-access-r4lxk\") pod \"ovn-northd-0\" (UID: \"99f1dc2d-f77e-447b-836c-d485426a72c2\") " pod="openstack/ovn-northd-0" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.549570 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f1dc2d-f77e-447b-836c-d485426a72c2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: 
\"99f1dc2d-f77e-447b-836c-d485426a72c2\") " pod="openstack/ovn-northd-0" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.549705 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/99f1dc2d-f77e-447b-836c-d485426a72c2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"99f1dc2d-f77e-447b-836c-d485426a72c2\") " pod="openstack/ovn-northd-0" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.550370 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99f1dc2d-f77e-447b-836c-d485426a72c2-config\") pod \"ovn-northd-0\" (UID: \"99f1dc2d-f77e-447b-836c-d485426a72c2\") " pod="openstack/ovn-northd-0" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.550522 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99f1dc2d-f77e-447b-836c-d485426a72c2-scripts\") pod \"ovn-northd-0\" (UID: \"99f1dc2d-f77e-447b-836c-d485426a72c2\") " pod="openstack/ovn-northd-0" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.555738 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f1dc2d-f77e-447b-836c-d485426a72c2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"99f1dc2d-f77e-447b-836c-d485426a72c2\") " pod="openstack/ovn-northd-0" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.557502 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f1dc2d-f77e-447b-836c-d485426a72c2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"99f1dc2d-f77e-447b-836c-d485426a72c2\") " pod="openstack/ovn-northd-0" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.563032 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/99f1dc2d-f77e-447b-836c-d485426a72c2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"99f1dc2d-f77e-447b-836c-d485426a72c2\") " pod="openstack/ovn-northd-0" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.566973 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4lxk\" (UniqueName: \"kubernetes.io/projected/99f1dc2d-f77e-447b-836c-d485426a72c2-kube-api-access-r4lxk\") pod \"ovn-northd-0\" (UID: \"99f1dc2d-f77e-447b-836c-d485426a72c2\") " pod="openstack/ovn-northd-0" Feb 28 09:18:59 crc kubenswrapper[4687]: I0228 09:18:59.670913 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 28 09:19:00 crc kubenswrapper[4687]: I0228 09:19:00.069490 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 28 09:19:00 crc kubenswrapper[4687]: W0228 09:19:00.072261 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99f1dc2d_f77e_447b_836c_d485426a72c2.slice/crio-748409aaa92529f6613d7b520bae42d5e99a8dffce86b92c24180a40bb83eb87 WatchSource:0}: Error finding container 748409aaa92529f6613d7b520bae42d5e99a8dffce86b92c24180a40bb83eb87: Status 404 returned error can't find the container with id 748409aaa92529f6613d7b520bae42d5e99a8dffce86b92c24180a40bb83eb87 Feb 28 09:19:00 crc kubenswrapper[4687]: I0228 09:19:00.551559 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"99f1dc2d-f77e-447b-836c-d485426a72c2","Type":"ContainerStarted","Data":"748409aaa92529f6613d7b520bae42d5e99a8dffce86b92c24180a40bb83eb87"} Feb 28 09:19:01 crc kubenswrapper[4687]: I0228 09:19:01.578562 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 28 09:19:01 crc kubenswrapper[4687]: I0228 09:19:01.578617 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/openstack-galera-0" Feb 28 09:19:01 crc kubenswrapper[4687]: I0228 09:19:01.641510 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 28 09:19:02 crc kubenswrapper[4687]: I0228 09:19:02.316285 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" Feb 28 09:19:02 crc kubenswrapper[4687]: I0228 09:19:02.365949 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-2qbft"] Feb 28 09:19:02 crc kubenswrapper[4687]: I0228 09:19:02.366199 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c47bcb9f9-2qbft" podUID="f611fd7a-502d-4db5-ad7f-eae15ccd9486" containerName="dnsmasq-dns" containerID="cri-o://84a856ce3dc0520d63c5594629f7656ad21f3e778d8f7266847a6f325e3404be" gracePeriod=10 Feb 28 09:19:02 crc kubenswrapper[4687]: I0228 09:19:02.572912 4687 generic.go:334] "Generic (PLEG): container finished" podID="f611fd7a-502d-4db5-ad7f-eae15ccd9486" containerID="84a856ce3dc0520d63c5594629f7656ad21f3e778d8f7266847a6f325e3404be" exitCode=0 Feb 28 09:19:02 crc kubenswrapper[4687]: I0228 09:19:02.573048 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-2qbft" event={"ID":"f611fd7a-502d-4db5-ad7f-eae15ccd9486","Type":"ContainerDied","Data":"84a856ce3dc0520d63c5594629f7656ad21f3e778d8f7266847a6f325e3404be"} Feb 28 09:19:02 crc kubenswrapper[4687]: I0228 09:19:02.667607 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 28 09:19:03 crc kubenswrapper[4687]: I0228 09:19:03.028524 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 28 09:19:03 crc kubenswrapper[4687]: I0228 09:19:03.028580 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/openstack-cell1-galera-0" Feb 28 09:19:03 crc kubenswrapper[4687]: I0228 09:19:03.086369 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 28 09:19:03 crc kubenswrapper[4687]: I0228 09:19:03.652718 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.284497 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-2qbft" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.347842 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-598c-account-create-update-k5s5m"] Feb 28 09:19:04 crc kubenswrapper[4687]: E0228 09:19:04.348565 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f611fd7a-502d-4db5-ad7f-eae15ccd9486" containerName="init" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.348656 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="f611fd7a-502d-4db5-ad7f-eae15ccd9486" containerName="init" Feb 28 09:19:04 crc kubenswrapper[4687]: E0228 09:19:04.349659 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f611fd7a-502d-4db5-ad7f-eae15ccd9486" containerName="dnsmasq-dns" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.349751 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="f611fd7a-502d-4db5-ad7f-eae15ccd9486" containerName="dnsmasq-dns" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.350243 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="f611fd7a-502d-4db5-ad7f-eae15ccd9486" containerName="dnsmasq-dns" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.351186 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-598c-account-create-update-k5s5m" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.357125 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.372101 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-598c-account-create-update-k5s5m"] Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.389144 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-hggb5"] Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.390414 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-hggb5" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.396662 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-hggb5"] Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.438398 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg4j5\" (UniqueName: \"kubernetes.io/projected/f611fd7a-502d-4db5-ad7f-eae15ccd9486-kube-api-access-jg4j5\") pod \"f611fd7a-502d-4db5-ad7f-eae15ccd9486\" (UID: \"f611fd7a-502d-4db5-ad7f-eae15ccd9486\") " Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.438483 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f611fd7a-502d-4db5-ad7f-eae15ccd9486-config\") pod \"f611fd7a-502d-4db5-ad7f-eae15ccd9486\" (UID: \"f611fd7a-502d-4db5-ad7f-eae15ccd9486\") " Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.438666 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f611fd7a-502d-4db5-ad7f-eae15ccd9486-dns-svc\") pod \"f611fd7a-502d-4db5-ad7f-eae15ccd9486\" (UID: \"f611fd7a-502d-4db5-ad7f-eae15ccd9486\") " Feb 28 09:19:04 crc 
kubenswrapper[4687]: I0228 09:19:04.441983 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f611fd7a-502d-4db5-ad7f-eae15ccd9486-kube-api-access-jg4j5" (OuterVolumeSpecName: "kube-api-access-jg4j5") pod "f611fd7a-502d-4db5-ad7f-eae15ccd9486" (UID: "f611fd7a-502d-4db5-ad7f-eae15ccd9486"). InnerVolumeSpecName "kube-api-access-jg4j5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.469632 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f611fd7a-502d-4db5-ad7f-eae15ccd9486-config" (OuterVolumeSpecName: "config") pod "f611fd7a-502d-4db5-ad7f-eae15ccd9486" (UID: "f611fd7a-502d-4db5-ad7f-eae15ccd9486"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.470944 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f611fd7a-502d-4db5-ad7f-eae15ccd9486-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f611fd7a-502d-4db5-ad7f-eae15ccd9486" (UID: "f611fd7a-502d-4db5-ad7f-eae15ccd9486"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.494004 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-5jbj4"] Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.495125 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-5jbj4" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.499366 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-5jbj4"] Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.541154 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e84feee-0007-4202-a1b7-cf6a25ea3261-operator-scripts\") pod \"keystone-598c-account-create-update-k5s5m\" (UID: \"3e84feee-0007-4202-a1b7-cf6a25ea3261\") " pod="openstack/keystone-598c-account-create-update-k5s5m" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.541218 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftmq4\" (UniqueName: \"kubernetes.io/projected/3e84feee-0007-4202-a1b7-cf6a25ea3261-kube-api-access-ftmq4\") pod \"keystone-598c-account-create-update-k5s5m\" (UID: \"3e84feee-0007-4202-a1b7-cf6a25ea3261\") " pod="openstack/keystone-598c-account-create-update-k5s5m" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.541395 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqcvn\" (UniqueName: \"kubernetes.io/projected/9d6ebc98-5929-43f4-8973-a8036ba6b8ca-kube-api-access-gqcvn\") pod \"keystone-db-create-hggb5\" (UID: \"9d6ebc98-5929-43f4-8973-a8036ba6b8ca\") " pod="openstack/keystone-db-create-hggb5" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.541727 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d6ebc98-5929-43f4-8973-a8036ba6b8ca-operator-scripts\") pod \"keystone-db-create-hggb5\" (UID: \"9d6ebc98-5929-43f4-8973-a8036ba6b8ca\") " pod="openstack/keystone-db-create-hggb5" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.541880 4687 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f611fd7a-502d-4db5-ad7f-eae15ccd9486-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.541918 4687 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f611fd7a-502d-4db5-ad7f-eae15ccd9486-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.541930 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg4j5\" (UniqueName: \"kubernetes.io/projected/f611fd7a-502d-4db5-ad7f-eae15ccd9486-kube-api-access-jg4j5\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.589682 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-2qbft" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.589682 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-2qbft" event={"ID":"f611fd7a-502d-4db5-ad7f-eae15ccd9486","Type":"ContainerDied","Data":"a5a0afad058d51a498acc85fae711a8fd1f2f09354d1936afc7823a0c03dc65c"} Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.589799 4687 scope.go:117] "RemoveContainer" containerID="84a856ce3dc0520d63c5594629f7656ad21f3e778d8f7266847a6f325e3404be" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.591978 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"99f1dc2d-f77e-447b-836c-d485426a72c2","Type":"ContainerStarted","Data":"69a88785d482f6c32a2b8cec86f486b705e901bc8cad55a34f13f9757e4eb2bb"} Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.592043 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"99f1dc2d-f77e-447b-836c-d485426a72c2","Type":"ContainerStarted","Data":"29c98415408a687c3a59a541bb61d78c816a27a723f1d2ed7bc9ec02e6db4e85"} 
Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.592141 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.602265 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-3732-account-create-update-zndhg"] Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.603326 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3732-account-create-update-zndhg" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.604849 4687 scope.go:117] "RemoveContainer" containerID="b295fbf01e0a1931dead7c7a3d99c745155d27959cf4ba695b0b34ee6b59fdeb" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.605366 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.609286 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3732-account-create-update-zndhg"] Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.624663 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.571482909 podStartE2EDuration="5.624642467s" podCreationTimestamp="2026-02-28 09:18:59 +0000 UTC" firstStartedPulling="2026-02-28 09:19:00.073914488 +0000 UTC m=+931.764483825" lastFinishedPulling="2026-02-28 09:19:04.127074046 +0000 UTC m=+935.817643383" observedRunningTime="2026-02-28 09:19:04.622720582 +0000 UTC m=+936.313289919" watchObservedRunningTime="2026-02-28 09:19:04.624642467 +0000 UTC m=+936.315211803" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.643704 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d6ebc98-5929-43f4-8973-a8036ba6b8ca-operator-scripts\") pod \"keystone-db-create-hggb5\" (UID: 
\"9d6ebc98-5929-43f4-8973-a8036ba6b8ca\") " pod="openstack/keystone-db-create-hggb5" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.643851 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e84feee-0007-4202-a1b7-cf6a25ea3261-operator-scripts\") pod \"keystone-598c-account-create-update-k5s5m\" (UID: \"3e84feee-0007-4202-a1b7-cf6a25ea3261\") " pod="openstack/keystone-598c-account-create-update-k5s5m" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.643910 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftmq4\" (UniqueName: \"kubernetes.io/projected/3e84feee-0007-4202-a1b7-cf6a25ea3261-kube-api-access-ftmq4\") pod \"keystone-598c-account-create-update-k5s5m\" (UID: \"3e84feee-0007-4202-a1b7-cf6a25ea3261\") " pod="openstack/keystone-598c-account-create-update-k5s5m" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.643946 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqfhf\" (UniqueName: \"kubernetes.io/projected/d8cf1bc0-26d7-4e51-895b-425350692fef-kube-api-access-pqfhf\") pod \"placement-db-create-5jbj4\" (UID: \"d8cf1bc0-26d7-4e51-895b-425350692fef\") " pod="openstack/placement-db-create-5jbj4" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.644004 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqcvn\" (UniqueName: \"kubernetes.io/projected/9d6ebc98-5929-43f4-8973-a8036ba6b8ca-kube-api-access-gqcvn\") pod \"keystone-db-create-hggb5\" (UID: \"9d6ebc98-5929-43f4-8973-a8036ba6b8ca\") " pod="openstack/keystone-db-create-hggb5" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.644035 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d8cf1bc0-26d7-4e51-895b-425350692fef-operator-scripts\") pod \"placement-db-create-5jbj4\" (UID: \"d8cf1bc0-26d7-4e51-895b-425350692fef\") " pod="openstack/placement-db-create-5jbj4" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.644880 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d6ebc98-5929-43f4-8973-a8036ba6b8ca-operator-scripts\") pod \"keystone-db-create-hggb5\" (UID: \"9d6ebc98-5929-43f4-8973-a8036ba6b8ca\") " pod="openstack/keystone-db-create-hggb5" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.644899 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e84feee-0007-4202-a1b7-cf6a25ea3261-operator-scripts\") pod \"keystone-598c-account-create-update-k5s5m\" (UID: \"3e84feee-0007-4202-a1b7-cf6a25ea3261\") " pod="openstack/keystone-598c-account-create-update-k5s5m" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.664300 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqcvn\" (UniqueName: \"kubernetes.io/projected/9d6ebc98-5929-43f4-8973-a8036ba6b8ca-kube-api-access-gqcvn\") pod \"keystone-db-create-hggb5\" (UID: \"9d6ebc98-5929-43f4-8973-a8036ba6b8ca\") " pod="openstack/keystone-db-create-hggb5" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.664318 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftmq4\" (UniqueName: \"kubernetes.io/projected/3e84feee-0007-4202-a1b7-cf6a25ea3261-kube-api-access-ftmq4\") pod \"keystone-598c-account-create-update-k5s5m\" (UID: \"3e84feee-0007-4202-a1b7-cf6a25ea3261\") " pod="openstack/keystone-598c-account-create-update-k5s5m" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.678051 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-2qbft"] Feb 28 09:19:04 crc 
kubenswrapper[4687]: I0228 09:19:04.678089 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-2qbft"] Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.684991 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-598c-account-create-update-k5s5m" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.721575 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-hggb5" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.746936 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqfhf\" (UniqueName: \"kubernetes.io/projected/d8cf1bc0-26d7-4e51-895b-425350692fef-kube-api-access-pqfhf\") pod \"placement-db-create-5jbj4\" (UID: \"d8cf1bc0-26d7-4e51-895b-425350692fef\") " pod="openstack/placement-db-create-5jbj4" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.748378 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8cf1bc0-26d7-4e51-895b-425350692fef-operator-scripts\") pod \"placement-db-create-5jbj4\" (UID: \"d8cf1bc0-26d7-4e51-895b-425350692fef\") " pod="openstack/placement-db-create-5jbj4" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.748627 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45pf8\" (UniqueName: \"kubernetes.io/projected/e0ef2e04-72ff-4461-a018-d126bd85f161-kube-api-access-45pf8\") pod \"placement-3732-account-create-update-zndhg\" (UID: \"e0ef2e04-72ff-4461-a018-d126bd85f161\") " pod="openstack/placement-3732-account-create-update-zndhg" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.748750 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e0ef2e04-72ff-4461-a018-d126bd85f161-operator-scripts\") pod \"placement-3732-account-create-update-zndhg\" (UID: \"e0ef2e04-72ff-4461-a018-d126bd85f161\") " pod="openstack/placement-3732-account-create-update-zndhg" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.749423 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8cf1bc0-26d7-4e51-895b-425350692fef-operator-scripts\") pod \"placement-db-create-5jbj4\" (UID: \"d8cf1bc0-26d7-4e51-895b-425350692fef\") " pod="openstack/placement-db-create-5jbj4" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.766347 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqfhf\" (UniqueName: \"kubernetes.io/projected/d8cf1bc0-26d7-4e51-895b-425350692fef-kube-api-access-pqfhf\") pod \"placement-db-create-5jbj4\" (UID: \"d8cf1bc0-26d7-4e51-895b-425350692fef\") " pod="openstack/placement-db-create-5jbj4" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.817398 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-5jbj4" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.851053 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0ef2e04-72ff-4461-a018-d126bd85f161-operator-scripts\") pod \"placement-3732-account-create-update-zndhg\" (UID: \"e0ef2e04-72ff-4461-a018-d126bd85f161\") " pod="openstack/placement-3732-account-create-update-zndhg" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.852089 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0ef2e04-72ff-4461-a018-d126bd85f161-operator-scripts\") pod \"placement-3732-account-create-update-zndhg\" (UID: \"e0ef2e04-72ff-4461-a018-d126bd85f161\") " pod="openstack/placement-3732-account-create-update-zndhg" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.852783 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45pf8\" (UniqueName: \"kubernetes.io/projected/e0ef2e04-72ff-4461-a018-d126bd85f161-kube-api-access-45pf8\") pod \"placement-3732-account-create-update-zndhg\" (UID: \"e0ef2e04-72ff-4461-a018-d126bd85f161\") " pod="openstack/placement-3732-account-create-update-zndhg" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.871553 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45pf8\" (UniqueName: \"kubernetes.io/projected/e0ef2e04-72ff-4461-a018-d126bd85f161-kube-api-access-45pf8\") pod \"placement-3732-account-create-update-zndhg\" (UID: \"e0ef2e04-72ff-4461-a018-d126bd85f161\") " pod="openstack/placement-3732-account-create-update-zndhg" Feb 28 09:19:04 crc kubenswrapper[4687]: I0228 09:19:04.924830 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3732-account-create-update-zndhg" Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.115740 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-598c-account-create-update-k5s5m"] Feb 28 09:19:05 crc kubenswrapper[4687]: W0228 09:19:05.133340 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e84feee_0007_4202_a1b7_cf6a25ea3261.slice/crio-d0feaebdf17f6da04577311deeef8d0204f8d8773560fc7d433067bde337d6be WatchSource:0}: Error finding container d0feaebdf17f6da04577311deeef8d0204f8d8773560fc7d433067bde337d6be: Status 404 returned error can't find the container with id d0feaebdf17f6da04577311deeef8d0204f8d8773560fc7d433067bde337d6be Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.204729 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-hggb5"] Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.280318 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-5jbj4"] Feb 28 09:19:05 crc kubenswrapper[4687]: W0228 09:19:05.323284 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8cf1bc0_26d7_4e51_895b_425350692fef.slice/crio-e5c04ef991b4378ec6e10bf6a5cb89cec7db5cff7df6a8c9b9a2f374e7b45b6e WatchSource:0}: Error finding container e5c04ef991b4378ec6e10bf6a5cb89cec7db5cff7df6a8c9b9a2f374e7b45b6e: Status 404 returned error can't find the container with id e5c04ef991b4378ec6e10bf6a5cb89cec7db5cff7df6a8c9b9a2f374e7b45b6e Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.394775 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3732-account-create-update-zndhg"] Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.463800 4687 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-675f7dd995-289rq"] Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.467064 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-289rq" Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.484563 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-289rq"] Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.568434 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-config\") pod \"dnsmasq-dns-675f7dd995-289rq\" (UID: \"bd32ea7d-aac0-4f3a-87fb-71e34e00889d\") " pod="openstack/dnsmasq-dns-675f7dd995-289rq" Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.568492 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-dns-svc\") pod \"dnsmasq-dns-675f7dd995-289rq\" (UID: \"bd32ea7d-aac0-4f3a-87fb-71e34e00889d\") " pod="openstack/dnsmasq-dns-675f7dd995-289rq" Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.568567 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-289rq\" (UID: \"bd32ea7d-aac0-4f3a-87fb-71e34e00889d\") " pod="openstack/dnsmasq-dns-675f7dd995-289rq" Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.568612 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf245\" (UniqueName: \"kubernetes.io/projected/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-kube-api-access-zf245\") pod \"dnsmasq-dns-675f7dd995-289rq\" (UID: \"bd32ea7d-aac0-4f3a-87fb-71e34e00889d\") " 
pod="openstack/dnsmasq-dns-675f7dd995-289rq" Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.568664 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-289rq\" (UID: \"bd32ea7d-aac0-4f3a-87fb-71e34e00889d\") " pod="openstack/dnsmasq-dns-675f7dd995-289rq" Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.610925 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hggb5" event={"ID":"9d6ebc98-5929-43f4-8973-a8036ba6b8ca","Type":"ContainerStarted","Data":"f230721f719bbed5c79d6aba813ad4b0a576e7fa8df89bcf3430a86d08efa913"} Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.610974 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hggb5" event={"ID":"9d6ebc98-5929-43f4-8973-a8036ba6b8ca","Type":"ContainerStarted","Data":"ecd1e5934a83f7b3242c7fce93738dc46b1e6d1ad7eb25ed5a7dd70dc918e627"} Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.616671 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-598c-account-create-update-k5s5m" event={"ID":"3e84feee-0007-4202-a1b7-cf6a25ea3261","Type":"ContainerStarted","Data":"d0feaebdf17f6da04577311deeef8d0204f8d8773560fc7d433067bde337d6be"} Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.626954 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3732-account-create-update-zndhg" event={"ID":"e0ef2e04-72ff-4461-a018-d126bd85f161","Type":"ContainerStarted","Data":"dbcb400af7273e724e5e5fc000ad9086ebd167ba7ec9bfde41bb078db720d832"} Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.629539 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5jbj4" 
event={"ID":"d8cf1bc0-26d7-4e51-895b-425350692fef","Type":"ContainerStarted","Data":"e5c04ef991b4378ec6e10bf6a5cb89cec7db5cff7df6a8c9b9a2f374e7b45b6e"} Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.630239 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-hggb5" podStartSLOduration=1.6302264659999999 podStartE2EDuration="1.630226466s" podCreationTimestamp="2026-02-28 09:19:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:19:05.622909926 +0000 UTC m=+937.313479263" watchObservedRunningTime="2026-02-28 09:19:05.630226466 +0000 UTC m=+937.320795804" Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.669833 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-config\") pod \"dnsmasq-dns-675f7dd995-289rq\" (UID: \"bd32ea7d-aac0-4f3a-87fb-71e34e00889d\") " pod="openstack/dnsmasq-dns-675f7dd995-289rq" Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.669879 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-dns-svc\") pod \"dnsmasq-dns-675f7dd995-289rq\" (UID: \"bd32ea7d-aac0-4f3a-87fb-71e34e00889d\") " pod="openstack/dnsmasq-dns-675f7dd995-289rq" Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.669923 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-289rq\" (UID: \"bd32ea7d-aac0-4f3a-87fb-71e34e00889d\") " pod="openstack/dnsmasq-dns-675f7dd995-289rq" Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.669952 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-zf245\" (UniqueName: \"kubernetes.io/projected/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-kube-api-access-zf245\") pod \"dnsmasq-dns-675f7dd995-289rq\" (UID: \"bd32ea7d-aac0-4f3a-87fb-71e34e00889d\") " pod="openstack/dnsmasq-dns-675f7dd995-289rq" Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.669979 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-289rq\" (UID: \"bd32ea7d-aac0-4f3a-87fb-71e34e00889d\") " pod="openstack/dnsmasq-dns-675f7dd995-289rq" Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.670768 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-config\") pod \"dnsmasq-dns-675f7dd995-289rq\" (UID: \"bd32ea7d-aac0-4f3a-87fb-71e34e00889d\") " pod="openstack/dnsmasq-dns-675f7dd995-289rq" Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.671166 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-289rq\" (UID: \"bd32ea7d-aac0-4f3a-87fb-71e34e00889d\") " pod="openstack/dnsmasq-dns-675f7dd995-289rq" Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.671429 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-289rq\" (UID: \"bd32ea7d-aac0-4f3a-87fb-71e34e00889d\") " pod="openstack/dnsmasq-dns-675f7dd995-289rq" Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.671932 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-dns-svc\") pod \"dnsmasq-dns-675f7dd995-289rq\" (UID: \"bd32ea7d-aac0-4f3a-87fb-71e34e00889d\") " pod="openstack/dnsmasq-dns-675f7dd995-289rq" Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.695285 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf245\" (UniqueName: \"kubernetes.io/projected/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-kube-api-access-zf245\") pod \"dnsmasq-dns-675f7dd995-289rq\" (UID: \"bd32ea7d-aac0-4f3a-87fb-71e34e00889d\") " pod="openstack/dnsmasq-dns-675f7dd995-289rq" Feb 28 09:19:05 crc kubenswrapper[4687]: I0228 09:19:05.870679 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-289rq" Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.247623 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-289rq"] Feb 28 09:19:06 crc kubenswrapper[4687]: W0228 09:19:06.251566 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd32ea7d_aac0_4f3a_87fb_71e34e00889d.slice/crio-a236a7cb9a01d032592f9dcb07915aeff4c220574f3d0b4fb5f4e98a4d80258b WatchSource:0}: Error finding container a236a7cb9a01d032592f9dcb07915aeff4c220574f3d0b4fb5f4e98a4d80258b: Status 404 returned error can't find the container with id a236a7cb9a01d032592f9dcb07915aeff4c220574f3d0b4fb5f4e98a4d80258b Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.640032 4687 generic.go:334] "Generic (PLEG): container finished" podID="d8cf1bc0-26d7-4e51-895b-425350692fef" containerID="edbf5171945cc4d9e4218ee79894987ce70f92e1629e0e9f8b5fb8f09e7ad5d0" exitCode=0 Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.640143 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5jbj4" 
event={"ID":"d8cf1bc0-26d7-4e51-895b-425350692fef","Type":"ContainerDied","Data":"edbf5171945cc4d9e4218ee79894987ce70f92e1629e0e9f8b5fb8f09e7ad5d0"} Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.641752 4687 generic.go:334] "Generic (PLEG): container finished" podID="bd32ea7d-aac0-4f3a-87fb-71e34e00889d" containerID="03c6ec8b0305c644cb9128a429441e50bebc572c8ee6fa11934aebec54ca9106" exitCode=0 Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.641862 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-289rq" event={"ID":"bd32ea7d-aac0-4f3a-87fb-71e34e00889d","Type":"ContainerDied","Data":"03c6ec8b0305c644cb9128a429441e50bebc572c8ee6fa11934aebec54ca9106"} Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.641924 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-289rq" event={"ID":"bd32ea7d-aac0-4f3a-87fb-71e34e00889d","Type":"ContainerStarted","Data":"a236a7cb9a01d032592f9dcb07915aeff4c220574f3d0b4fb5f4e98a4d80258b"} Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.643677 4687 generic.go:334] "Generic (PLEG): container finished" podID="9d6ebc98-5929-43f4-8973-a8036ba6b8ca" containerID="f230721f719bbed5c79d6aba813ad4b0a576e7fa8df89bcf3430a86d08efa913" exitCode=0 Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.643789 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hggb5" event={"ID":"9d6ebc98-5929-43f4-8973-a8036ba6b8ca","Type":"ContainerDied","Data":"f230721f719bbed5c79d6aba813ad4b0a576e7fa8df89bcf3430a86d08efa913"} Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.646178 4687 generic.go:334] "Generic (PLEG): container finished" podID="3e84feee-0007-4202-a1b7-cf6a25ea3261" containerID="f97495ca2f3ee2c7437d7157ec557e2540e90cde7b6f582cacb30fcf12613fc4" exitCode=0 Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.646301 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-598c-account-create-update-k5s5m" event={"ID":"3e84feee-0007-4202-a1b7-cf6a25ea3261","Type":"ContainerDied","Data":"f97495ca2f3ee2c7437d7157ec557e2540e90cde7b6f582cacb30fcf12613fc4"} Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.648008 4687 generic.go:334] "Generic (PLEG): container finished" podID="e0ef2e04-72ff-4461-a018-d126bd85f161" containerID="6218b08f1988e15b17ea1d6e05aeacc80e08b573c118e3b0c3f4e4467e0ac852" exitCode=0 Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.648063 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3732-account-create-update-zndhg" event={"ID":"e0ef2e04-72ff-4461-a018-d126bd85f161","Type":"ContainerDied","Data":"6218b08f1988e15b17ea1d6e05aeacc80e08b573c118e3b0c3f4e4467e0ac852"} Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.665853 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f611fd7a-502d-4db5-ad7f-eae15ccd9486" path="/var/lib/kubelet/pods/f611fd7a-502d-4db5-ad7f-eae15ccd9486/volumes" Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.690822 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.696910 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.701512 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.702134 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-lm6j6" Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.702258 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.702469 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.704083 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.794711 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f53dddde-f595-46a9-9764-dce250c7f5b0-cache\") pod \"swift-storage-0\" (UID: \"f53dddde-f595-46a9-9764-dce250c7f5b0\") " pod="openstack/swift-storage-0" Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.794772 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f53dddde-f595-46a9-9764-dce250c7f5b0-etc-swift\") pod \"swift-storage-0\" (UID: \"f53dddde-f595-46a9-9764-dce250c7f5b0\") " pod="openstack/swift-storage-0" Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.794808 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"f53dddde-f595-46a9-9764-dce250c7f5b0\") " pod="openstack/swift-storage-0" Feb 28 09:19:06 crc 
kubenswrapper[4687]: I0228 09:19:06.794909 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p666c\" (UniqueName: \"kubernetes.io/projected/f53dddde-f595-46a9-9764-dce250c7f5b0-kube-api-access-p666c\") pod \"swift-storage-0\" (UID: \"f53dddde-f595-46a9-9764-dce250c7f5b0\") " pod="openstack/swift-storage-0" Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.794943 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f53dddde-f595-46a9-9764-dce250c7f5b0-lock\") pod \"swift-storage-0\" (UID: \"f53dddde-f595-46a9-9764-dce250c7f5b0\") " pod="openstack/swift-storage-0" Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.795079 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f53dddde-f595-46a9-9764-dce250c7f5b0-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"f53dddde-f595-46a9-9764-dce250c7f5b0\") " pod="openstack/swift-storage-0" Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.896586 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f53dddde-f595-46a9-9764-dce250c7f5b0-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"f53dddde-f595-46a9-9764-dce250c7f5b0\") " pod="openstack/swift-storage-0" Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.896727 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f53dddde-f595-46a9-9764-dce250c7f5b0-cache\") pod \"swift-storage-0\" (UID: \"f53dddde-f595-46a9-9764-dce250c7f5b0\") " pod="openstack/swift-storage-0" Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.896776 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/projected/f53dddde-f595-46a9-9764-dce250c7f5b0-etc-swift\") pod \"swift-storage-0\" (UID: \"f53dddde-f595-46a9-9764-dce250c7f5b0\") " pod="openstack/swift-storage-0" Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.896829 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"f53dddde-f595-46a9-9764-dce250c7f5b0\") " pod="openstack/swift-storage-0" Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.896905 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p666c\" (UniqueName: \"kubernetes.io/projected/f53dddde-f595-46a9-9764-dce250c7f5b0-kube-api-access-p666c\") pod \"swift-storage-0\" (UID: \"f53dddde-f595-46a9-9764-dce250c7f5b0\") " pod="openstack/swift-storage-0" Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.896947 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f53dddde-f595-46a9-9764-dce250c7f5b0-lock\") pod \"swift-storage-0\" (UID: \"f53dddde-f595-46a9-9764-dce250c7f5b0\") " pod="openstack/swift-storage-0" Feb 28 09:19:06 crc kubenswrapper[4687]: E0228 09:19:06.897258 4687 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 28 09:19:06 crc kubenswrapper[4687]: E0228 09:19:06.897289 4687 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 28 09:19:06 crc kubenswrapper[4687]: E0228 09:19:06.897350 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f53dddde-f595-46a9-9764-dce250c7f5b0-etc-swift podName:f53dddde-f595-46a9-9764-dce250c7f5b0 nodeName:}" failed. 
No retries permitted until 2026-02-28 09:19:07.397332355 +0000 UTC m=+939.087901693 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f53dddde-f595-46a9-9764-dce250c7f5b0-etc-swift") pod "swift-storage-0" (UID: "f53dddde-f595-46a9-9764-dce250c7f5b0") : configmap "swift-ring-files" not found Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.897284 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f53dddde-f595-46a9-9764-dce250c7f5b0-cache\") pod \"swift-storage-0\" (UID: \"f53dddde-f595-46a9-9764-dce250c7f5b0\") " pod="openstack/swift-storage-0" Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.897454 4687 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"f53dddde-f595-46a9-9764-dce250c7f5b0\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/swift-storage-0" Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.897540 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f53dddde-f595-46a9-9764-dce250c7f5b0-lock\") pod \"swift-storage-0\" (UID: \"f53dddde-f595-46a9-9764-dce250c7f5b0\") " pod="openstack/swift-storage-0" Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.903710 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f53dddde-f595-46a9-9764-dce250c7f5b0-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"f53dddde-f595-46a9-9764-dce250c7f5b0\") " pod="openstack/swift-storage-0" Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.912418 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p666c\" (UniqueName: 
\"kubernetes.io/projected/f53dddde-f595-46a9-9764-dce250c7f5b0-kube-api-access-p666c\") pod \"swift-storage-0\" (UID: \"f53dddde-f595-46a9-9764-dce250c7f5b0\") " pod="openstack/swift-storage-0" Feb 28 09:19:06 crc kubenswrapper[4687]: I0228 09:19:06.916427 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"f53dddde-f595-46a9-9764-dce250c7f5b0\") " pod="openstack/swift-storage-0" Feb 28 09:19:07 crc kubenswrapper[4687]: I0228 09:19:07.408936 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f53dddde-f595-46a9-9764-dce250c7f5b0-etc-swift\") pod \"swift-storage-0\" (UID: \"f53dddde-f595-46a9-9764-dce250c7f5b0\") " pod="openstack/swift-storage-0" Feb 28 09:19:07 crc kubenswrapper[4687]: E0228 09:19:07.409213 4687 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 28 09:19:07 crc kubenswrapper[4687]: E0228 09:19:07.409379 4687 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 28 09:19:07 crc kubenswrapper[4687]: E0228 09:19:07.409450 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f53dddde-f595-46a9-9764-dce250c7f5b0-etc-swift podName:f53dddde-f595-46a9-9764-dce250c7f5b0 nodeName:}" failed. No retries permitted until 2026-02-28 09:19:08.409430805 +0000 UTC m=+940.100000142 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f53dddde-f595-46a9-9764-dce250c7f5b0-etc-swift") pod "swift-storage-0" (UID: "f53dddde-f595-46a9-9764-dce250c7f5b0") : configmap "swift-ring-files" not found Feb 28 09:19:07 crc kubenswrapper[4687]: I0228 09:19:07.657695 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-289rq" event={"ID":"bd32ea7d-aac0-4f3a-87fb-71e34e00889d","Type":"ContainerStarted","Data":"8d21056435780720347f0d40902d5b56c2dd93117afec1d73fedefe239612a01"} Feb 28 09:19:07 crc kubenswrapper[4687]: I0228 09:19:07.682616 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-675f7dd995-289rq" podStartSLOduration=2.682597475 podStartE2EDuration="2.682597475s" podCreationTimestamp="2026-02-28 09:19:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:19:07.677653517 +0000 UTC m=+939.368222854" watchObservedRunningTime="2026-02-28 09:19:07.682597475 +0000 UTC m=+939.373166812" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.012951 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-598c-account-create-update-k5s5m" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.114745 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-5jbj4" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.142055 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e84feee-0007-4202-a1b7-cf6a25ea3261-operator-scripts\") pod \"3e84feee-0007-4202-a1b7-cf6a25ea3261\" (UID: \"3e84feee-0007-4202-a1b7-cf6a25ea3261\") " Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.142291 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftmq4\" (UniqueName: \"kubernetes.io/projected/3e84feee-0007-4202-a1b7-cf6a25ea3261-kube-api-access-ftmq4\") pod \"3e84feee-0007-4202-a1b7-cf6a25ea3261\" (UID: \"3e84feee-0007-4202-a1b7-cf6a25ea3261\") " Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.142941 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e84feee-0007-4202-a1b7-cf6a25ea3261-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3e84feee-0007-4202-a1b7-cf6a25ea3261" (UID: "3e84feee-0007-4202-a1b7-cf6a25ea3261"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.148050 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e84feee-0007-4202-a1b7-cf6a25ea3261-kube-api-access-ftmq4" (OuterVolumeSpecName: "kube-api-access-ftmq4") pod "3e84feee-0007-4202-a1b7-cf6a25ea3261" (UID: "3e84feee-0007-4202-a1b7-cf6a25ea3261"). InnerVolumeSpecName "kube-api-access-ftmq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.175494 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-hggb5" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.179955 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3732-account-create-update-zndhg" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.244140 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqfhf\" (UniqueName: \"kubernetes.io/projected/d8cf1bc0-26d7-4e51-895b-425350692fef-kube-api-access-pqfhf\") pod \"d8cf1bc0-26d7-4e51-895b-425350692fef\" (UID: \"d8cf1bc0-26d7-4e51-895b-425350692fef\") " Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.244283 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8cf1bc0-26d7-4e51-895b-425350692fef-operator-scripts\") pod \"d8cf1bc0-26d7-4e51-895b-425350692fef\" (UID: \"d8cf1bc0-26d7-4e51-895b-425350692fef\") " Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.244737 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftmq4\" (UniqueName: \"kubernetes.io/projected/3e84feee-0007-4202-a1b7-cf6a25ea3261-kube-api-access-ftmq4\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.244760 4687 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e84feee-0007-4202-a1b7-cf6a25ea3261-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.245067 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8cf1bc0-26d7-4e51-895b-425350692fef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d8cf1bc0-26d7-4e51-895b-425350692fef" (UID: "d8cf1bc0-26d7-4e51-895b-425350692fef"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.252867 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8cf1bc0-26d7-4e51-895b-425350692fef-kube-api-access-pqfhf" (OuterVolumeSpecName: "kube-api-access-pqfhf") pod "d8cf1bc0-26d7-4e51-895b-425350692fef" (UID: "d8cf1bc0-26d7-4e51-895b-425350692fef"). InnerVolumeSpecName "kube-api-access-pqfhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.346198 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0ef2e04-72ff-4461-a018-d126bd85f161-operator-scripts\") pod \"e0ef2e04-72ff-4461-a018-d126bd85f161\" (UID: \"e0ef2e04-72ff-4461-a018-d126bd85f161\") " Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.346277 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d6ebc98-5929-43f4-8973-a8036ba6b8ca-operator-scripts\") pod \"9d6ebc98-5929-43f4-8973-a8036ba6b8ca\" (UID: \"9d6ebc98-5929-43f4-8973-a8036ba6b8ca\") " Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.346517 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45pf8\" (UniqueName: \"kubernetes.io/projected/e0ef2e04-72ff-4461-a018-d126bd85f161-kube-api-access-45pf8\") pod \"e0ef2e04-72ff-4461-a018-d126bd85f161\" (UID: \"e0ef2e04-72ff-4461-a018-d126bd85f161\") " Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.346612 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqcvn\" (UniqueName: \"kubernetes.io/projected/9d6ebc98-5929-43f4-8973-a8036ba6b8ca-kube-api-access-gqcvn\") pod \"9d6ebc98-5929-43f4-8973-a8036ba6b8ca\" (UID: \"9d6ebc98-5929-43f4-8973-a8036ba6b8ca\") " Feb 28 09:19:08 crc 
kubenswrapper[4687]: I0228 09:19:08.346820 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0ef2e04-72ff-4461-a018-d126bd85f161-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e0ef2e04-72ff-4461-a018-d126bd85f161" (UID: "e0ef2e04-72ff-4461-a018-d126bd85f161"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.346933 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d6ebc98-5929-43f4-8973-a8036ba6b8ca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9d6ebc98-5929-43f4-8973-a8036ba6b8ca" (UID: "9d6ebc98-5929-43f4-8973-a8036ba6b8ca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.347433 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqfhf\" (UniqueName: \"kubernetes.io/projected/d8cf1bc0-26d7-4e51-895b-425350692fef-kube-api-access-pqfhf\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.347458 4687 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8cf1bc0-26d7-4e51-895b-425350692fef-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.347467 4687 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0ef2e04-72ff-4461-a018-d126bd85f161-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.347478 4687 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d6ebc98-5929-43f4-8973-a8036ba6b8ca-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:08 crc 
kubenswrapper[4687]: I0228 09:19:08.350237 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0ef2e04-72ff-4461-a018-d126bd85f161-kube-api-access-45pf8" (OuterVolumeSpecName: "kube-api-access-45pf8") pod "e0ef2e04-72ff-4461-a018-d126bd85f161" (UID: "e0ef2e04-72ff-4461-a018-d126bd85f161"). InnerVolumeSpecName "kube-api-access-45pf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.350744 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d6ebc98-5929-43f4-8973-a8036ba6b8ca-kube-api-access-gqcvn" (OuterVolumeSpecName: "kube-api-access-gqcvn") pod "9d6ebc98-5929-43f4-8973-a8036ba6b8ca" (UID: "9d6ebc98-5929-43f4-8973-a8036ba6b8ca"). InnerVolumeSpecName "kube-api-access-gqcvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.395749 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-knnj5"] Feb 28 09:19:08 crc kubenswrapper[4687]: E0228 09:19:08.396124 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d6ebc98-5929-43f4-8973-a8036ba6b8ca" containerName="mariadb-database-create" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.396144 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d6ebc98-5929-43f4-8973-a8036ba6b8ca" containerName="mariadb-database-create" Feb 28 09:19:08 crc kubenswrapper[4687]: E0228 09:19:08.396171 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8cf1bc0-26d7-4e51-895b-425350692fef" containerName="mariadb-database-create" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.396178 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8cf1bc0-26d7-4e51-895b-425350692fef" containerName="mariadb-database-create" Feb 28 09:19:08 crc kubenswrapper[4687]: E0228 09:19:08.396207 4687 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e0ef2e04-72ff-4461-a018-d126bd85f161" containerName="mariadb-account-create-update" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.396212 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ef2e04-72ff-4461-a018-d126bd85f161" containerName="mariadb-account-create-update" Feb 28 09:19:08 crc kubenswrapper[4687]: E0228 09:19:08.396234 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e84feee-0007-4202-a1b7-cf6a25ea3261" containerName="mariadb-account-create-update" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.396241 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e84feee-0007-4202-a1b7-cf6a25ea3261" containerName="mariadb-account-create-update" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.396391 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0ef2e04-72ff-4461-a018-d126bd85f161" containerName="mariadb-account-create-update" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.396405 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d6ebc98-5929-43f4-8973-a8036ba6b8ca" containerName="mariadb-database-create" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.396421 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8cf1bc0-26d7-4e51-895b-425350692fef" containerName="mariadb-database-create" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.396432 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e84feee-0007-4202-a1b7-cf6a25ea3261" containerName="mariadb-account-create-update" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.396965 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-knnj5" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.405829 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-knnj5"] Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.449245 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6k86\" (UniqueName: \"kubernetes.io/projected/d2212f7e-7ffb-4643-9f92-151ac33b6062-kube-api-access-w6k86\") pod \"glance-db-create-knnj5\" (UID: \"d2212f7e-7ffb-4643-9f92-151ac33b6062\") " pod="openstack/glance-db-create-knnj5" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.449299 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2212f7e-7ffb-4643-9f92-151ac33b6062-operator-scripts\") pod \"glance-db-create-knnj5\" (UID: \"d2212f7e-7ffb-4643-9f92-151ac33b6062\") " pod="openstack/glance-db-create-knnj5" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.449336 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f53dddde-f595-46a9-9764-dce250c7f5b0-etc-swift\") pod \"swift-storage-0\" (UID: \"f53dddde-f595-46a9-9764-dce250c7f5b0\") " pod="openstack/swift-storage-0" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.449430 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45pf8\" (UniqueName: \"kubernetes.io/projected/e0ef2e04-72ff-4461-a018-d126bd85f161-kube-api-access-45pf8\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.449448 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqcvn\" (UniqueName: \"kubernetes.io/projected/9d6ebc98-5929-43f4-8973-a8036ba6b8ca-kube-api-access-gqcvn\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:08 crc kubenswrapper[4687]: E0228 
09:19:08.449510 4687 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 28 09:19:08 crc kubenswrapper[4687]: E0228 09:19:08.449528 4687 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 28 09:19:08 crc kubenswrapper[4687]: E0228 09:19:08.449576 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f53dddde-f595-46a9-9764-dce250c7f5b0-etc-swift podName:f53dddde-f595-46a9-9764-dce250c7f5b0 nodeName:}" failed. No retries permitted until 2026-02-28 09:19:10.449560371 +0000 UTC m=+942.140129709 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f53dddde-f595-46a9-9764-dce250c7f5b0-etc-swift") pod "swift-storage-0" (UID: "f53dddde-f595-46a9-9764-dce250c7f5b0") : configmap "swift-ring-files" not found Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.523276 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-7b4d-account-create-update-6cbx6"] Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.524376 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-7b4d-account-create-update-6cbx6" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.526879 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.529962 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-7b4d-account-create-update-6cbx6"] Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.551375 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2212f7e-7ffb-4643-9f92-151ac33b6062-operator-scripts\") pod \"glance-db-create-knnj5\" (UID: \"d2212f7e-7ffb-4643-9f92-151ac33b6062\") " pod="openstack/glance-db-create-knnj5" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.551454 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsqbf\" (UniqueName: \"kubernetes.io/projected/05bb3807-5df1-422e-bc01-f42bec6ed506-kube-api-access-fsqbf\") pod \"glance-7b4d-account-create-update-6cbx6\" (UID: \"05bb3807-5df1-422e-bc01-f42bec6ed506\") " pod="openstack/glance-7b4d-account-create-update-6cbx6" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.551619 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6k86\" (UniqueName: \"kubernetes.io/projected/d2212f7e-7ffb-4643-9f92-151ac33b6062-kube-api-access-w6k86\") pod \"glance-db-create-knnj5\" (UID: \"d2212f7e-7ffb-4643-9f92-151ac33b6062\") " pod="openstack/glance-db-create-knnj5" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.551679 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05bb3807-5df1-422e-bc01-f42bec6ed506-operator-scripts\") pod \"glance-7b4d-account-create-update-6cbx6\" (UID: 
\"05bb3807-5df1-422e-bc01-f42bec6ed506\") " pod="openstack/glance-7b4d-account-create-update-6cbx6" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.552178 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2212f7e-7ffb-4643-9f92-151ac33b6062-operator-scripts\") pod \"glance-db-create-knnj5\" (UID: \"d2212f7e-7ffb-4643-9f92-151ac33b6062\") " pod="openstack/glance-db-create-knnj5" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.566245 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6k86\" (UniqueName: \"kubernetes.io/projected/d2212f7e-7ffb-4643-9f92-151ac33b6062-kube-api-access-w6k86\") pod \"glance-db-create-knnj5\" (UID: \"d2212f7e-7ffb-4643-9f92-151ac33b6062\") " pod="openstack/glance-db-create-knnj5" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.652662 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsqbf\" (UniqueName: \"kubernetes.io/projected/05bb3807-5df1-422e-bc01-f42bec6ed506-kube-api-access-fsqbf\") pod \"glance-7b4d-account-create-update-6cbx6\" (UID: \"05bb3807-5df1-422e-bc01-f42bec6ed506\") " pod="openstack/glance-7b4d-account-create-update-6cbx6" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.652787 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05bb3807-5df1-422e-bc01-f42bec6ed506-operator-scripts\") pod \"glance-7b4d-account-create-update-6cbx6\" (UID: \"05bb3807-5df1-422e-bc01-f42bec6ed506\") " pod="openstack/glance-7b4d-account-create-update-6cbx6" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.653486 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05bb3807-5df1-422e-bc01-f42bec6ed506-operator-scripts\") pod \"glance-7b4d-account-create-update-6cbx6\" 
(UID: \"05bb3807-5df1-422e-bc01-f42bec6ed506\") " pod="openstack/glance-7b4d-account-create-update-6cbx6" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.670400 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsqbf\" (UniqueName: \"kubernetes.io/projected/05bb3807-5df1-422e-bc01-f42bec6ed506-kube-api-access-fsqbf\") pod \"glance-7b4d-account-create-update-6cbx6\" (UID: \"05bb3807-5df1-422e-bc01-f42bec6ed506\") " pod="openstack/glance-7b4d-account-create-update-6cbx6" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.670884 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3732-account-create-update-zndhg" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.673406 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3732-account-create-update-zndhg" event={"ID":"e0ef2e04-72ff-4461-a018-d126bd85f161","Type":"ContainerDied","Data":"dbcb400af7273e724e5e5fc000ad9086ebd167ba7ec9bfde41bb078db720d832"} Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.673451 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbcb400af7273e724e5e5fc000ad9086ebd167ba7ec9bfde41bb078db720d832" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.674636 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5jbj4" event={"ID":"d8cf1bc0-26d7-4e51-895b-425350692fef","Type":"ContainerDied","Data":"e5c04ef991b4378ec6e10bf6a5cb89cec7db5cff7df6a8c9b9a2f374e7b45b6e"} Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.674688 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5c04ef991b4378ec6e10bf6a5cb89cec7db5cff7df6a8c9b9a2f374e7b45b6e" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.674759 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-5jbj4" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.682446 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hggb5" event={"ID":"9d6ebc98-5929-43f4-8973-a8036ba6b8ca","Type":"ContainerDied","Data":"ecd1e5934a83f7b3242c7fce93738dc46b1e6d1ad7eb25ed5a7dd70dc918e627"} Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.682486 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecd1e5934a83f7b3242c7fce93738dc46b1e6d1ad7eb25ed5a7dd70dc918e627" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.682745 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-hggb5" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.685257 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-598c-account-create-update-k5s5m" event={"ID":"3e84feee-0007-4202-a1b7-cf6a25ea3261","Type":"ContainerDied","Data":"d0feaebdf17f6da04577311deeef8d0204f8d8773560fc7d433067bde337d6be"} Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.685309 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0feaebdf17f6da04577311deeef8d0204f8d8773560fc7d433067bde337d6be" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.685338 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-675f7dd995-289rq" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.685349 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-598c-account-create-update-k5s5m" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.719534 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-knnj5" Feb 28 09:19:08 crc kubenswrapper[4687]: E0228 09:19:08.811803 4687 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0ef2e04_72ff_4461_a018_d126bd85f161.slice/crio-dbcb400af7273e724e5e5fc000ad9086ebd167ba7ec9bfde41bb078db720d832\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e84feee_0007_4202_a1b7_cf6a25ea3261.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d6ebc98_5929_43f4_8973_a8036ba6b8ca.slice/crio-ecd1e5934a83f7b3242c7fce93738dc46b1e6d1ad7eb25ed5a7dd70dc918e627\": RecentStats: unable to find data in memory cache]" Feb 28 09:19:08 crc kubenswrapper[4687]: I0228 09:19:08.839597 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-7b4d-account-create-update-6cbx6" Feb 28 09:19:09 crc kubenswrapper[4687]: I0228 09:19:09.115686 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-knnj5"] Feb 28 09:19:09 crc kubenswrapper[4687]: W0228 09:19:09.254729 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05bb3807_5df1_422e_bc01_f42bec6ed506.slice/crio-b4d4051559b9d14dfb89ac8baf59be5db6959afe0093b891943b9fc42e133d29 WatchSource:0}: Error finding container b4d4051559b9d14dfb89ac8baf59be5db6959afe0093b891943b9fc42e133d29: Status 404 returned error can't find the container with id b4d4051559b9d14dfb89ac8baf59be5db6959afe0093b891943b9fc42e133d29 Feb 28 09:19:09 crc kubenswrapper[4687]: I0228 09:19:09.256124 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-7b4d-account-create-update-6cbx6"] Feb 28 09:19:09 crc kubenswrapper[4687]: I0228 
09:19:09.695173 4687 generic.go:334] "Generic (PLEG): container finished" podID="05bb3807-5df1-422e-bc01-f42bec6ed506" containerID="0239e9ab22ba287acdc4e244c1de4c8f081fa58787f957df79cefe263a124ad2" exitCode=0 Feb 28 09:19:09 crc kubenswrapper[4687]: I0228 09:19:09.695349 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7b4d-account-create-update-6cbx6" event={"ID":"05bb3807-5df1-422e-bc01-f42bec6ed506","Type":"ContainerDied","Data":"0239e9ab22ba287acdc4e244c1de4c8f081fa58787f957df79cefe263a124ad2"} Feb 28 09:19:09 crc kubenswrapper[4687]: I0228 09:19:09.695597 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7b4d-account-create-update-6cbx6" event={"ID":"05bb3807-5df1-422e-bc01-f42bec6ed506","Type":"ContainerStarted","Data":"b4d4051559b9d14dfb89ac8baf59be5db6959afe0093b891943b9fc42e133d29"} Feb 28 09:19:09 crc kubenswrapper[4687]: I0228 09:19:09.698486 4687 generic.go:334] "Generic (PLEG): container finished" podID="d2212f7e-7ffb-4643-9f92-151ac33b6062" containerID="1bd614f3e473614c47200c859df1a3360530237034c4af005f1d4e7a1adf0aca" exitCode=0 Feb 28 09:19:09 crc kubenswrapper[4687]: I0228 09:19:09.698558 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-knnj5" event={"ID":"d2212f7e-7ffb-4643-9f92-151ac33b6062","Type":"ContainerDied","Data":"1bd614f3e473614c47200c859df1a3360530237034c4af005f1d4e7a1adf0aca"} Feb 28 09:19:09 crc kubenswrapper[4687]: I0228 09:19:09.698645 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-knnj5" event={"ID":"d2212f7e-7ffb-4643-9f92-151ac33b6062","Type":"ContainerStarted","Data":"08302b49749344524d01328ad9635d5f97eeb855e117b61c4c77691c93df67ce"} Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.195393 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-nqgnb"] Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.196494 4687 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/root-account-create-update-nqgnb" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.201149 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.205846 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-nqgnb"] Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.290702 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6d13626-b4ef-4af1-b65e-0f2717745a7a-operator-scripts\") pod \"root-account-create-update-nqgnb\" (UID: \"a6d13626-b4ef-4af1-b65e-0f2717745a7a\") " pod="openstack/root-account-create-update-nqgnb" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.290813 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4x42\" (UniqueName: \"kubernetes.io/projected/a6d13626-b4ef-4af1-b65e-0f2717745a7a-kube-api-access-p4x42\") pod \"root-account-create-update-nqgnb\" (UID: \"a6d13626-b4ef-4af1-b65e-0f2717745a7a\") " pod="openstack/root-account-create-update-nqgnb" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.392297 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6d13626-b4ef-4af1-b65e-0f2717745a7a-operator-scripts\") pod \"root-account-create-update-nqgnb\" (UID: \"a6d13626-b4ef-4af1-b65e-0f2717745a7a\") " pod="openstack/root-account-create-update-nqgnb" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.392374 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4x42\" (UniqueName: \"kubernetes.io/projected/a6d13626-b4ef-4af1-b65e-0f2717745a7a-kube-api-access-p4x42\") pod \"root-account-create-update-nqgnb\" (UID: 
\"a6d13626-b4ef-4af1-b65e-0f2717745a7a\") " pod="openstack/root-account-create-update-nqgnb" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.393683 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6d13626-b4ef-4af1-b65e-0f2717745a7a-operator-scripts\") pod \"root-account-create-update-nqgnb\" (UID: \"a6d13626-b4ef-4af1-b65e-0f2717745a7a\") " pod="openstack/root-account-create-update-nqgnb" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.413879 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4x42\" (UniqueName: \"kubernetes.io/projected/a6d13626-b4ef-4af1-b65e-0f2717745a7a-kube-api-access-p4x42\") pod \"root-account-create-update-nqgnb\" (UID: \"a6d13626-b4ef-4af1-b65e-0f2717745a7a\") " pod="openstack/root-account-create-update-nqgnb" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.494079 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f53dddde-f595-46a9-9764-dce250c7f5b0-etc-swift\") pod \"swift-storage-0\" (UID: \"f53dddde-f595-46a9-9764-dce250c7f5b0\") " pod="openstack/swift-storage-0" Feb 28 09:19:10 crc kubenswrapper[4687]: E0228 09:19:10.494333 4687 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 28 09:19:10 crc kubenswrapper[4687]: E0228 09:19:10.494370 4687 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 28 09:19:10 crc kubenswrapper[4687]: E0228 09:19:10.494439 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f53dddde-f595-46a9-9764-dce250c7f5b0-etc-swift podName:f53dddde-f595-46a9-9764-dce250c7f5b0 nodeName:}" failed. 
No retries permitted until 2026-02-28 09:19:14.494419462 +0000 UTC m=+946.184988798 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f53dddde-f595-46a9-9764-dce250c7f5b0-etc-swift") pod "swift-storage-0" (UID: "f53dddde-f595-46a9-9764-dce250c7f5b0") : configmap "swift-ring-files" not found Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.511653 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nqgnb" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.512932 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-s57nv"] Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.514066 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-s57nv" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.515504 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.516190 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.516961 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.529237 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-s57nv"] Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.596498 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhg62\" (UniqueName: \"kubernetes.io/projected/6cf929c8-d005-4feb-8eb4-544e89507ad9-kube-api-access-nhg62\") pod \"swift-ring-rebalance-s57nv\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " pod="openstack/swift-ring-rebalance-s57nv" Feb 28 09:19:10 crc 
kubenswrapper[4687]: I0228 09:19:10.596735 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6cf929c8-d005-4feb-8eb4-544e89507ad9-dispersionconf\") pod \"swift-ring-rebalance-s57nv\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " pod="openstack/swift-ring-rebalance-s57nv" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.596783 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6cf929c8-d005-4feb-8eb4-544e89507ad9-etc-swift\") pod \"swift-ring-rebalance-s57nv\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " pod="openstack/swift-ring-rebalance-s57nv" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.596830 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6cf929c8-d005-4feb-8eb4-544e89507ad9-ring-data-devices\") pod \"swift-ring-rebalance-s57nv\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " pod="openstack/swift-ring-rebalance-s57nv" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.596917 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6cf929c8-d005-4feb-8eb4-544e89507ad9-scripts\") pod \"swift-ring-rebalance-s57nv\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " pod="openstack/swift-ring-rebalance-s57nv" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.596936 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf929c8-d005-4feb-8eb4-544e89507ad9-combined-ca-bundle\") pod \"swift-ring-rebalance-s57nv\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " pod="openstack/swift-ring-rebalance-s57nv" Feb 28 09:19:10 
crc kubenswrapper[4687]: I0228 09:19:10.596957 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6cf929c8-d005-4feb-8eb4-544e89507ad9-swiftconf\") pod \"swift-ring-rebalance-s57nv\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " pod="openstack/swift-ring-rebalance-s57nv" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.698574 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6cf929c8-d005-4feb-8eb4-544e89507ad9-etc-swift\") pod \"swift-ring-rebalance-s57nv\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " pod="openstack/swift-ring-rebalance-s57nv" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.698622 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6cf929c8-d005-4feb-8eb4-544e89507ad9-ring-data-devices\") pod \"swift-ring-rebalance-s57nv\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " pod="openstack/swift-ring-rebalance-s57nv" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.698674 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6cf929c8-d005-4feb-8eb4-544e89507ad9-scripts\") pod \"swift-ring-rebalance-s57nv\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " pod="openstack/swift-ring-rebalance-s57nv" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.698693 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf929c8-d005-4feb-8eb4-544e89507ad9-combined-ca-bundle\") pod \"swift-ring-rebalance-s57nv\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " pod="openstack/swift-ring-rebalance-s57nv" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.698714 4687 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6cf929c8-d005-4feb-8eb4-544e89507ad9-swiftconf\") pod \"swift-ring-rebalance-s57nv\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " pod="openstack/swift-ring-rebalance-s57nv" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.698799 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhg62\" (UniqueName: \"kubernetes.io/projected/6cf929c8-d005-4feb-8eb4-544e89507ad9-kube-api-access-nhg62\") pod \"swift-ring-rebalance-s57nv\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " pod="openstack/swift-ring-rebalance-s57nv" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.698830 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6cf929c8-d005-4feb-8eb4-544e89507ad9-dispersionconf\") pod \"swift-ring-rebalance-s57nv\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " pod="openstack/swift-ring-rebalance-s57nv" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.699808 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6cf929c8-d005-4feb-8eb4-544e89507ad9-scripts\") pod \"swift-ring-rebalance-s57nv\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " pod="openstack/swift-ring-rebalance-s57nv" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.699850 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6cf929c8-d005-4feb-8eb4-544e89507ad9-ring-data-devices\") pod \"swift-ring-rebalance-s57nv\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " pod="openstack/swift-ring-rebalance-s57nv" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.700054 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/6cf929c8-d005-4feb-8eb4-544e89507ad9-etc-swift\") pod \"swift-ring-rebalance-s57nv\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " pod="openstack/swift-ring-rebalance-s57nv" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.705123 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6cf929c8-d005-4feb-8eb4-544e89507ad9-swiftconf\") pod \"swift-ring-rebalance-s57nv\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " pod="openstack/swift-ring-rebalance-s57nv" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.706925 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf929c8-d005-4feb-8eb4-544e89507ad9-combined-ca-bundle\") pod \"swift-ring-rebalance-s57nv\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " pod="openstack/swift-ring-rebalance-s57nv" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.707491 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6cf929c8-d005-4feb-8eb4-544e89507ad9-dispersionconf\") pod \"swift-ring-rebalance-s57nv\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " pod="openstack/swift-ring-rebalance-s57nv" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.716671 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhg62\" (UniqueName: \"kubernetes.io/projected/6cf929c8-d005-4feb-8eb4-544e89507ad9-kube-api-access-nhg62\") pod \"swift-ring-rebalance-s57nv\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " pod="openstack/swift-ring-rebalance-s57nv" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.890819 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-s57nv" Feb 28 09:19:10 crc kubenswrapper[4687]: I0228 09:19:10.917881 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-nqgnb"] Feb 28 09:19:10 crc kubenswrapper[4687]: W0228 09:19:10.925795 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6d13626_b4ef_4af1_b65e_0f2717745a7a.slice/crio-cecadf6976d958c2b71a110fb258d79972a2258477f34e73cfc9a9c54b42decc WatchSource:0}: Error finding container cecadf6976d958c2b71a110fb258d79972a2258477f34e73cfc9a9c54b42decc: Status 404 returned error can't find the container with id cecadf6976d958c2b71a110fb258d79972a2258477f34e73cfc9a9c54b42decc Feb 28 09:19:11 crc kubenswrapper[4687]: I0228 09:19:11.025348 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-knnj5" Feb 28 09:19:11 crc kubenswrapper[4687]: I0228 09:19:11.084635 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-7b4d-account-create-update-6cbx6" Feb 28 09:19:11 crc kubenswrapper[4687]: I0228 09:19:11.105592 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2212f7e-7ffb-4643-9f92-151ac33b6062-operator-scripts\") pod \"d2212f7e-7ffb-4643-9f92-151ac33b6062\" (UID: \"d2212f7e-7ffb-4643-9f92-151ac33b6062\") " Feb 28 09:19:11 crc kubenswrapper[4687]: I0228 09:19:11.105818 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05bb3807-5df1-422e-bc01-f42bec6ed506-operator-scripts\") pod \"05bb3807-5df1-422e-bc01-f42bec6ed506\" (UID: \"05bb3807-5df1-422e-bc01-f42bec6ed506\") " Feb 28 09:19:11 crc kubenswrapper[4687]: I0228 09:19:11.105861 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6k86\" (UniqueName: \"kubernetes.io/projected/d2212f7e-7ffb-4643-9f92-151ac33b6062-kube-api-access-w6k86\") pod \"d2212f7e-7ffb-4643-9f92-151ac33b6062\" (UID: \"d2212f7e-7ffb-4643-9f92-151ac33b6062\") " Feb 28 09:19:11 crc kubenswrapper[4687]: I0228 09:19:11.105889 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsqbf\" (UniqueName: \"kubernetes.io/projected/05bb3807-5df1-422e-bc01-f42bec6ed506-kube-api-access-fsqbf\") pod \"05bb3807-5df1-422e-bc01-f42bec6ed506\" (UID: \"05bb3807-5df1-422e-bc01-f42bec6ed506\") " Feb 28 09:19:11 crc kubenswrapper[4687]: I0228 09:19:11.106377 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2212f7e-7ffb-4643-9f92-151ac33b6062-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d2212f7e-7ffb-4643-9f92-151ac33b6062" (UID: "d2212f7e-7ffb-4643-9f92-151ac33b6062"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:11 crc kubenswrapper[4687]: I0228 09:19:11.106413 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05bb3807-5df1-422e-bc01-f42bec6ed506-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "05bb3807-5df1-422e-bc01-f42bec6ed506" (UID: "05bb3807-5df1-422e-bc01-f42bec6ed506"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:11 crc kubenswrapper[4687]: I0228 09:19:11.111035 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2212f7e-7ffb-4643-9f92-151ac33b6062-kube-api-access-w6k86" (OuterVolumeSpecName: "kube-api-access-w6k86") pod "d2212f7e-7ffb-4643-9f92-151ac33b6062" (UID: "d2212f7e-7ffb-4643-9f92-151ac33b6062"). InnerVolumeSpecName "kube-api-access-w6k86". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:19:11 crc kubenswrapper[4687]: I0228 09:19:11.111889 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05bb3807-5df1-422e-bc01-f42bec6ed506-kube-api-access-fsqbf" (OuterVolumeSpecName: "kube-api-access-fsqbf") pod "05bb3807-5df1-422e-bc01-f42bec6ed506" (UID: "05bb3807-5df1-422e-bc01-f42bec6ed506"). InnerVolumeSpecName "kube-api-access-fsqbf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:19:11 crc kubenswrapper[4687]: I0228 09:19:11.209898 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6k86\" (UniqueName: \"kubernetes.io/projected/d2212f7e-7ffb-4643-9f92-151ac33b6062-kube-api-access-w6k86\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:11 crc kubenswrapper[4687]: I0228 09:19:11.209949 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsqbf\" (UniqueName: \"kubernetes.io/projected/05bb3807-5df1-422e-bc01-f42bec6ed506-kube-api-access-fsqbf\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:11 crc kubenswrapper[4687]: I0228 09:19:11.209962 4687 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2212f7e-7ffb-4643-9f92-151ac33b6062-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:11 crc kubenswrapper[4687]: I0228 09:19:11.209977 4687 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05bb3807-5df1-422e-bc01-f42bec6ed506-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:11 crc kubenswrapper[4687]: I0228 09:19:11.346685 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-s57nv"] Feb 28 09:19:11 crc kubenswrapper[4687]: W0228 09:19:11.348165 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cf929c8_d005_4feb_8eb4_544e89507ad9.slice/crio-2c6290a7e8024c689ee1327fe9dd906f8d080e09dab438f3a32d9e19d6a5bead WatchSource:0}: Error finding container 2c6290a7e8024c689ee1327fe9dd906f8d080e09dab438f3a32d9e19d6a5bead: Status 404 returned error can't find the container with id 2c6290a7e8024c689ee1327fe9dd906f8d080e09dab438f3a32d9e19d6a5bead Feb 28 09:19:11 crc kubenswrapper[4687]: I0228 09:19:11.716520 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-create-knnj5" event={"ID":"d2212f7e-7ffb-4643-9f92-151ac33b6062","Type":"ContainerDied","Data":"08302b49749344524d01328ad9635d5f97eeb855e117b61c4c77691c93df67ce"} Feb 28 09:19:11 crc kubenswrapper[4687]: I0228 09:19:11.716953 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08302b49749344524d01328ad9635d5f97eeb855e117b61c4c77691c93df67ce" Feb 28 09:19:11 crc kubenswrapper[4687]: I0228 09:19:11.716550 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-knnj5" Feb 28 09:19:11 crc kubenswrapper[4687]: I0228 09:19:11.719203 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7b4d-account-create-update-6cbx6" event={"ID":"05bb3807-5df1-422e-bc01-f42bec6ed506","Type":"ContainerDied","Data":"b4d4051559b9d14dfb89ac8baf59be5db6959afe0093b891943b9fc42e133d29"} Feb 28 09:19:11 crc kubenswrapper[4687]: I0228 09:19:11.719270 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4d4051559b9d14dfb89ac8baf59be5db6959afe0093b891943b9fc42e133d29" Feb 28 09:19:11 crc kubenswrapper[4687]: I0228 09:19:11.719299 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-7b4d-account-create-update-6cbx6" Feb 28 09:19:11 crc kubenswrapper[4687]: I0228 09:19:11.720908 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s57nv" event={"ID":"6cf929c8-d005-4feb-8eb4-544e89507ad9","Type":"ContainerStarted","Data":"2c6290a7e8024c689ee1327fe9dd906f8d080e09dab438f3a32d9e19d6a5bead"} Feb 28 09:19:11 crc kubenswrapper[4687]: I0228 09:19:11.722564 4687 generic.go:334] "Generic (PLEG): container finished" podID="a6d13626-b4ef-4af1-b65e-0f2717745a7a" containerID="3e5bb00299b86f3ec0a992669cd92b2f35aa21ca0d58a4937ad5cb5f0b571e63" exitCode=0 Feb 28 09:19:11 crc kubenswrapper[4687]: I0228 09:19:11.722605 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nqgnb" event={"ID":"a6d13626-b4ef-4af1-b65e-0f2717745a7a","Type":"ContainerDied","Data":"3e5bb00299b86f3ec0a992669cd92b2f35aa21ca0d58a4937ad5cb5f0b571e63"} Feb 28 09:19:11 crc kubenswrapper[4687]: I0228 09:19:11.722647 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nqgnb" event={"ID":"a6d13626-b4ef-4af1-b65e-0f2717745a7a","Type":"ContainerStarted","Data":"cecadf6976d958c2b71a110fb258d79972a2258477f34e73cfc9a9c54b42decc"} Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.047667 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-nqgnb" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.145478 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4x42\" (UniqueName: \"kubernetes.io/projected/a6d13626-b4ef-4af1-b65e-0f2717745a7a-kube-api-access-p4x42\") pod \"a6d13626-b4ef-4af1-b65e-0f2717745a7a\" (UID: \"a6d13626-b4ef-4af1-b65e-0f2717745a7a\") " Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.145585 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6d13626-b4ef-4af1-b65e-0f2717745a7a-operator-scripts\") pod \"a6d13626-b4ef-4af1-b65e-0f2717745a7a\" (UID: \"a6d13626-b4ef-4af1-b65e-0f2717745a7a\") " Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.146210 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6d13626-b4ef-4af1-b65e-0f2717745a7a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a6d13626-b4ef-4af1-b65e-0f2717745a7a" (UID: "a6d13626-b4ef-4af1-b65e-0f2717745a7a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.147532 4687 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6d13626-b4ef-4af1-b65e-0f2717745a7a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.150408 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6d13626-b4ef-4af1-b65e-0f2717745a7a-kube-api-access-p4x42" (OuterVolumeSpecName: "kube-api-access-p4x42") pod "a6d13626-b4ef-4af1-b65e-0f2717745a7a" (UID: "a6d13626-b4ef-4af1-b65e-0f2717745a7a"). InnerVolumeSpecName "kube-api-access-p4x42". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.249635 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4x42\" (UniqueName: \"kubernetes.io/projected/a6d13626-b4ef-4af1-b65e-0f2717745a7a-kube-api-access-p4x42\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.726372 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-l9np4"] Feb 28 09:19:13 crc kubenswrapper[4687]: E0228 09:19:13.726698 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6d13626-b4ef-4af1-b65e-0f2717745a7a" containerName="mariadb-account-create-update" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.726711 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d13626-b4ef-4af1-b65e-0f2717745a7a" containerName="mariadb-account-create-update" Feb 28 09:19:13 crc kubenswrapper[4687]: E0228 09:19:13.726746 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05bb3807-5df1-422e-bc01-f42bec6ed506" containerName="mariadb-account-create-update" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.726752 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="05bb3807-5df1-422e-bc01-f42bec6ed506" containerName="mariadb-account-create-update" Feb 28 09:19:13 crc kubenswrapper[4687]: E0228 09:19:13.726762 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2212f7e-7ffb-4643-9f92-151ac33b6062" containerName="mariadb-database-create" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.726768 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2212f7e-7ffb-4643-9f92-151ac33b6062" containerName="mariadb-database-create" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.726921 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6d13626-b4ef-4af1-b65e-0f2717745a7a" containerName="mariadb-account-create-update" Feb 28 09:19:13 crc 
kubenswrapper[4687]: I0228 09:19:13.726940 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="05bb3807-5df1-422e-bc01-f42bec6ed506" containerName="mariadb-account-create-update" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.726949 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2212f7e-7ffb-4643-9f92-151ac33b6062" containerName="mariadb-database-create" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.727500 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-l9np4" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.731672 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.731838 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-v85jk" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.742529 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-l9np4"] Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.748097 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nqgnb" event={"ID":"a6d13626-b4ef-4af1-b65e-0f2717745a7a","Type":"ContainerDied","Data":"cecadf6976d958c2b71a110fb258d79972a2258477f34e73cfc9a9c54b42decc"} Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.748138 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cecadf6976d958c2b71a110fb258d79972a2258477f34e73cfc9a9c54b42decc" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.748209 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-nqgnb" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.756542 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qh8x\" (UniqueName: \"kubernetes.io/projected/c8549972-64f9-4f47-a3db-42053850adb4-kube-api-access-5qh8x\") pod \"glance-db-sync-l9np4\" (UID: \"c8549972-64f9-4f47-a3db-42053850adb4\") " pod="openstack/glance-db-sync-l9np4" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.756602 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c8549972-64f9-4f47-a3db-42053850adb4-db-sync-config-data\") pod \"glance-db-sync-l9np4\" (UID: \"c8549972-64f9-4f47-a3db-42053850adb4\") " pod="openstack/glance-db-sync-l9np4" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.756656 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8549972-64f9-4f47-a3db-42053850adb4-config-data\") pod \"glance-db-sync-l9np4\" (UID: \"c8549972-64f9-4f47-a3db-42053850adb4\") " pod="openstack/glance-db-sync-l9np4" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.756727 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8549972-64f9-4f47-a3db-42053850adb4-combined-ca-bundle\") pod \"glance-db-sync-l9np4\" (UID: \"c8549972-64f9-4f47-a3db-42053850adb4\") " pod="openstack/glance-db-sync-l9np4" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.857892 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qh8x\" (UniqueName: \"kubernetes.io/projected/c8549972-64f9-4f47-a3db-42053850adb4-kube-api-access-5qh8x\") pod \"glance-db-sync-l9np4\" (UID: 
\"c8549972-64f9-4f47-a3db-42053850adb4\") " pod="openstack/glance-db-sync-l9np4" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.857944 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c8549972-64f9-4f47-a3db-42053850adb4-db-sync-config-data\") pod \"glance-db-sync-l9np4\" (UID: \"c8549972-64f9-4f47-a3db-42053850adb4\") " pod="openstack/glance-db-sync-l9np4" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.857972 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8549972-64f9-4f47-a3db-42053850adb4-config-data\") pod \"glance-db-sync-l9np4\" (UID: \"c8549972-64f9-4f47-a3db-42053850adb4\") " pod="openstack/glance-db-sync-l9np4" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.858052 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8549972-64f9-4f47-a3db-42053850adb4-combined-ca-bundle\") pod \"glance-db-sync-l9np4\" (UID: \"c8549972-64f9-4f47-a3db-42053850adb4\") " pod="openstack/glance-db-sync-l9np4" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.862467 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8549972-64f9-4f47-a3db-42053850adb4-combined-ca-bundle\") pod \"glance-db-sync-l9np4\" (UID: \"c8549972-64f9-4f47-a3db-42053850adb4\") " pod="openstack/glance-db-sync-l9np4" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.863260 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c8549972-64f9-4f47-a3db-42053850adb4-db-sync-config-data\") pod \"glance-db-sync-l9np4\" (UID: \"c8549972-64f9-4f47-a3db-42053850adb4\") " pod="openstack/glance-db-sync-l9np4" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 
09:19:13.863391 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8549972-64f9-4f47-a3db-42053850adb4-config-data\") pod \"glance-db-sync-l9np4\" (UID: \"c8549972-64f9-4f47-a3db-42053850adb4\") " pod="openstack/glance-db-sync-l9np4" Feb 28 09:19:13 crc kubenswrapper[4687]: I0228 09:19:13.871970 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qh8x\" (UniqueName: \"kubernetes.io/projected/c8549972-64f9-4f47-a3db-42053850adb4-kube-api-access-5qh8x\") pod \"glance-db-sync-l9np4\" (UID: \"c8549972-64f9-4f47-a3db-42053850adb4\") " pod="openstack/glance-db-sync-l9np4" Feb 28 09:19:14 crc kubenswrapper[4687]: I0228 09:19:14.049775 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-l9np4" Feb 28 09:19:14 crc kubenswrapper[4687]: I0228 09:19:14.570268 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f53dddde-f595-46a9-9764-dce250c7f5b0-etc-swift\") pod \"swift-storage-0\" (UID: \"f53dddde-f595-46a9-9764-dce250c7f5b0\") " pod="openstack/swift-storage-0" Feb 28 09:19:14 crc kubenswrapper[4687]: E0228 09:19:14.570471 4687 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 28 09:19:14 crc kubenswrapper[4687]: E0228 09:19:14.570487 4687 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 28 09:19:14 crc kubenswrapper[4687]: E0228 09:19:14.570540 4687 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f53dddde-f595-46a9-9764-dce250c7f5b0-etc-swift podName:f53dddde-f595-46a9-9764-dce250c7f5b0 nodeName:}" failed. No retries permitted until 2026-02-28 09:19:22.570520458 +0000 UTC m=+954.261089795 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f53dddde-f595-46a9-9764-dce250c7f5b0-etc-swift") pod "swift-storage-0" (UID: "f53dddde-f595-46a9-9764-dce250c7f5b0") : configmap "swift-ring-files" not found Feb 28 09:19:15 crc kubenswrapper[4687]: I0228 09:19:15.312850 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-l9np4"] Feb 28 09:19:15 crc kubenswrapper[4687]: W0228 09:19:15.319715 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8549972_64f9_4f47_a3db_42053850adb4.slice/crio-e1de3648a58a6b68cfc506e7a9fb43105e359daf47f8c4a5f1101a8b3214f81d WatchSource:0}: Error finding container e1de3648a58a6b68cfc506e7a9fb43105e359daf47f8c4a5f1101a8b3214f81d: Status 404 returned error can't find the container with id e1de3648a58a6b68cfc506e7a9fb43105e359daf47f8c4a5f1101a8b3214f81d Feb 28 09:19:15 crc kubenswrapper[4687]: I0228 09:19:15.765901 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s57nv" event={"ID":"6cf929c8-d005-4feb-8eb4-544e89507ad9","Type":"ContainerStarted","Data":"98838bc7a7ecbe77d6682e1c6f5c36c6c1d09d10f389d4505081e9d258fbaad7"} Feb 28 09:19:15 crc kubenswrapper[4687]: I0228 09:19:15.767527 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-l9np4" event={"ID":"c8549972-64f9-4f47-a3db-42053850adb4","Type":"ContainerStarted","Data":"e1de3648a58a6b68cfc506e7a9fb43105e359daf47f8c4a5f1101a8b3214f81d"} Feb 28 09:19:15 crc kubenswrapper[4687]: I0228 09:19:15.787128 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-s57nv" podStartSLOduration=2.228643056 podStartE2EDuration="5.787109742s" podCreationTimestamp="2026-02-28 09:19:10 +0000 UTC" firstStartedPulling="2026-02-28 09:19:11.350441606 +0000 UTC m=+943.041010942" lastFinishedPulling="2026-02-28 09:19:14.908908292 
+0000 UTC m=+946.599477628" observedRunningTime="2026-02-28 09:19:15.782285027 +0000 UTC m=+947.472854364" watchObservedRunningTime="2026-02-28 09:19:15.787109742 +0000 UTC m=+947.477679079" Feb 28 09:19:15 crc kubenswrapper[4687]: I0228 09:19:15.872203 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-675f7dd995-289rq" Feb 28 09:19:15 crc kubenswrapper[4687]: I0228 09:19:15.926449 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-l4xdl"] Feb 28 09:19:15 crc kubenswrapper[4687]: I0228 09:19:15.926732 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" podUID="53ebc6fa-0ccc-410b-9d22-5e5e978da47b" containerName="dnsmasq-dns" containerID="cri-o://31810d42d60fae982297517bec5917f0eb72aab74c9b3c578aa48da2259ccf7a" gracePeriod=10 Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.355165 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.407928 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84hqq\" (UniqueName: \"kubernetes.io/projected/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-kube-api-access-84hqq\") pod \"53ebc6fa-0ccc-410b-9d22-5e5e978da47b\" (UID: \"53ebc6fa-0ccc-410b-9d22-5e5e978da47b\") " Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.407999 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-dns-svc\") pod \"53ebc6fa-0ccc-410b-9d22-5e5e978da47b\" (UID: \"53ebc6fa-0ccc-410b-9d22-5e5e978da47b\") " Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.408081 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-ovsdbserver-nb\") pod \"53ebc6fa-0ccc-410b-9d22-5e5e978da47b\" (UID: \"53ebc6fa-0ccc-410b-9d22-5e5e978da47b\") " Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.408175 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-ovsdbserver-sb\") pod \"53ebc6fa-0ccc-410b-9d22-5e5e978da47b\" (UID: \"53ebc6fa-0ccc-410b-9d22-5e5e978da47b\") " Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.408225 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-config\") pod \"53ebc6fa-0ccc-410b-9d22-5e5e978da47b\" (UID: \"53ebc6fa-0ccc-410b-9d22-5e5e978da47b\") " Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.413651 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-kube-api-access-84hqq" (OuterVolumeSpecName: "kube-api-access-84hqq") pod "53ebc6fa-0ccc-410b-9d22-5e5e978da47b" (UID: "53ebc6fa-0ccc-410b-9d22-5e5e978da47b"). InnerVolumeSpecName "kube-api-access-84hqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.440664 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "53ebc6fa-0ccc-410b-9d22-5e5e978da47b" (UID: "53ebc6fa-0ccc-410b-9d22-5e5e978da47b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.441777 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "53ebc6fa-0ccc-410b-9d22-5e5e978da47b" (UID: "53ebc6fa-0ccc-410b-9d22-5e5e978da47b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.442470 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-config" (OuterVolumeSpecName: "config") pod "53ebc6fa-0ccc-410b-9d22-5e5e978da47b" (UID: "53ebc6fa-0ccc-410b-9d22-5e5e978da47b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.446765 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "53ebc6fa-0ccc-410b-9d22-5e5e978da47b" (UID: "53ebc6fa-0ccc-410b-9d22-5e5e978da47b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.510604 4687 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.510632 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.510646 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84hqq\" (UniqueName: \"kubernetes.io/projected/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-kube-api-access-84hqq\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.510657 4687 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.510665 4687 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53ebc6fa-0ccc-410b-9d22-5e5e978da47b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.676041 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-nqgnb"] Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.679757 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-nqgnb"] Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.782500 4687 generic.go:334] "Generic (PLEG): container finished" podID="53ebc6fa-0ccc-410b-9d22-5e5e978da47b" containerID="31810d42d60fae982297517bec5917f0eb72aab74c9b3c578aa48da2259ccf7a" exitCode=0 Feb 28 09:19:16 crc 
kubenswrapper[4687]: I0228 09:19:16.784065 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.784556 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" event={"ID":"53ebc6fa-0ccc-410b-9d22-5e5e978da47b","Type":"ContainerDied","Data":"31810d42d60fae982297517bec5917f0eb72aab74c9b3c578aa48da2259ccf7a"} Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.784619 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-l4xdl" event={"ID":"53ebc6fa-0ccc-410b-9d22-5e5e978da47b","Type":"ContainerDied","Data":"5cb040e7890419e9bf3204e7c7fdad2816334174755a60c874d1182b656d1b6f"} Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.784663 4687 scope.go:117] "RemoveContainer" containerID="31810d42d60fae982297517bec5917f0eb72aab74c9b3c578aa48da2259ccf7a" Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.817623 4687 scope.go:117] "RemoveContainer" containerID="d330cc5ff78d6b76c11114998bdbbde08f6b247cbcfc691d665c4e68f74d9cca" Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.817797 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-l4xdl"] Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.824829 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-l4xdl"] Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.833710 4687 scope.go:117] "RemoveContainer" containerID="31810d42d60fae982297517bec5917f0eb72aab74c9b3c578aa48da2259ccf7a" Feb 28 09:19:16 crc kubenswrapper[4687]: E0228 09:19:16.834126 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31810d42d60fae982297517bec5917f0eb72aab74c9b3c578aa48da2259ccf7a\": container with ID starting with 
31810d42d60fae982297517bec5917f0eb72aab74c9b3c578aa48da2259ccf7a not found: ID does not exist" containerID="31810d42d60fae982297517bec5917f0eb72aab74c9b3c578aa48da2259ccf7a" Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.834162 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31810d42d60fae982297517bec5917f0eb72aab74c9b3c578aa48da2259ccf7a"} err="failed to get container status \"31810d42d60fae982297517bec5917f0eb72aab74c9b3c578aa48da2259ccf7a\": rpc error: code = NotFound desc = could not find container \"31810d42d60fae982297517bec5917f0eb72aab74c9b3c578aa48da2259ccf7a\": container with ID starting with 31810d42d60fae982297517bec5917f0eb72aab74c9b3c578aa48da2259ccf7a not found: ID does not exist" Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.834189 4687 scope.go:117] "RemoveContainer" containerID="d330cc5ff78d6b76c11114998bdbbde08f6b247cbcfc691d665c4e68f74d9cca" Feb 28 09:19:16 crc kubenswrapper[4687]: E0228 09:19:16.834624 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d330cc5ff78d6b76c11114998bdbbde08f6b247cbcfc691d665c4e68f74d9cca\": container with ID starting with d330cc5ff78d6b76c11114998bdbbde08f6b247cbcfc691d665c4e68f74d9cca not found: ID does not exist" containerID="d330cc5ff78d6b76c11114998bdbbde08f6b247cbcfc691d665c4e68f74d9cca" Feb 28 09:19:16 crc kubenswrapper[4687]: I0228 09:19:16.834662 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d330cc5ff78d6b76c11114998bdbbde08f6b247cbcfc691d665c4e68f74d9cca"} err="failed to get container status \"d330cc5ff78d6b76c11114998bdbbde08f6b247cbcfc691d665c4e68f74d9cca\": rpc error: code = NotFound desc = could not find container \"d330cc5ff78d6b76c11114998bdbbde08f6b247cbcfc691d665c4e68f74d9cca\": container with ID starting with d330cc5ff78d6b76c11114998bdbbde08f6b247cbcfc691d665c4e68f74d9cca not found: ID does not 
exist" Feb 28 09:19:18 crc kubenswrapper[4687]: I0228 09:19:18.667742 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53ebc6fa-0ccc-410b-9d22-5e5e978da47b" path="/var/lib/kubelet/pods/53ebc6fa-0ccc-410b-9d22-5e5e978da47b/volumes" Feb 28 09:19:18 crc kubenswrapper[4687]: I0228 09:19:18.668741 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6d13626-b4ef-4af1-b65e-0f2717745a7a" path="/var/lib/kubelet/pods/a6d13626-b4ef-4af1-b65e-0f2717745a7a/volumes" Feb 28 09:19:19 crc kubenswrapper[4687]: I0228 09:19:19.726045 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 28 09:19:20 crc kubenswrapper[4687]: I0228 09:19:20.824065 4687 generic.go:334] "Generic (PLEG): container finished" podID="6cf929c8-d005-4feb-8eb4-544e89507ad9" containerID="98838bc7a7ecbe77d6682e1c6f5c36c6c1d09d10f389d4505081e9d258fbaad7" exitCode=0 Feb 28 09:19:20 crc kubenswrapper[4687]: I0228 09:19:20.824125 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s57nv" event={"ID":"6cf929c8-d005-4feb-8eb4-544e89507ad9","Type":"ContainerDied","Data":"98838bc7a7ecbe77d6682e1c6f5c36c6c1d09d10f389d4505081e9d258fbaad7"} Feb 28 09:19:21 crc kubenswrapper[4687]: I0228 09:19:21.696779 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-vckkp"] Feb 28 09:19:21 crc kubenswrapper[4687]: E0228 09:19:21.697341 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ebc6fa-0ccc-410b-9d22-5e5e978da47b" containerName="init" Feb 28 09:19:21 crc kubenswrapper[4687]: I0228 09:19:21.697366 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ebc6fa-0ccc-410b-9d22-5e5e978da47b" containerName="init" Feb 28 09:19:21 crc kubenswrapper[4687]: E0228 09:19:21.697384 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ebc6fa-0ccc-410b-9d22-5e5e978da47b" containerName="dnsmasq-dns" Feb 28 09:19:21 crc 
kubenswrapper[4687]: I0228 09:19:21.697393 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ebc6fa-0ccc-410b-9d22-5e5e978da47b" containerName="dnsmasq-dns" Feb 28 09:19:21 crc kubenswrapper[4687]: I0228 09:19:21.697559 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="53ebc6fa-0ccc-410b-9d22-5e5e978da47b" containerName="dnsmasq-dns" Feb 28 09:19:21 crc kubenswrapper[4687]: I0228 09:19:21.698454 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vckkp" Feb 28 09:19:21 crc kubenswrapper[4687]: I0228 09:19:21.700433 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 28 09:19:21 crc kubenswrapper[4687]: I0228 09:19:21.711827 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vckkp"] Feb 28 09:19:21 crc kubenswrapper[4687]: I0228 09:19:21.800723 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48f20836-ec64-4206-8f2c-4db709f61459-operator-scripts\") pod \"root-account-create-update-vckkp\" (UID: \"48f20836-ec64-4206-8f2c-4db709f61459\") " pod="openstack/root-account-create-update-vckkp" Feb 28 09:19:21 crc kubenswrapper[4687]: I0228 09:19:21.800974 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2w5v\" (UniqueName: \"kubernetes.io/projected/48f20836-ec64-4206-8f2c-4db709f61459-kube-api-access-j2w5v\") pod \"root-account-create-update-vckkp\" (UID: \"48f20836-ec64-4206-8f2c-4db709f61459\") " pod="openstack/root-account-create-update-vckkp" Feb 28 09:19:21 crc kubenswrapper[4687]: I0228 09:19:21.902577 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2w5v\" (UniqueName: 
\"kubernetes.io/projected/48f20836-ec64-4206-8f2c-4db709f61459-kube-api-access-j2w5v\") pod \"root-account-create-update-vckkp\" (UID: \"48f20836-ec64-4206-8f2c-4db709f61459\") " pod="openstack/root-account-create-update-vckkp" Feb 28 09:19:21 crc kubenswrapper[4687]: I0228 09:19:21.902767 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48f20836-ec64-4206-8f2c-4db709f61459-operator-scripts\") pod \"root-account-create-update-vckkp\" (UID: \"48f20836-ec64-4206-8f2c-4db709f61459\") " pod="openstack/root-account-create-update-vckkp" Feb 28 09:19:21 crc kubenswrapper[4687]: I0228 09:19:21.906847 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48f20836-ec64-4206-8f2c-4db709f61459-operator-scripts\") pod \"root-account-create-update-vckkp\" (UID: \"48f20836-ec64-4206-8f2c-4db709f61459\") " pod="openstack/root-account-create-update-vckkp" Feb 28 09:19:21 crc kubenswrapper[4687]: I0228 09:19:21.928760 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2w5v\" (UniqueName: \"kubernetes.io/projected/48f20836-ec64-4206-8f2c-4db709f61459-kube-api-access-j2w5v\") pod \"root-account-create-update-vckkp\" (UID: \"48f20836-ec64-4206-8f2c-4db709f61459\") " pod="openstack/root-account-create-update-vckkp" Feb 28 09:19:22 crc kubenswrapper[4687]: I0228 09:19:22.018169 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-vckkp" Feb 28 09:19:22 crc kubenswrapper[4687]: I0228 09:19:22.616985 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f53dddde-f595-46a9-9764-dce250c7f5b0-etc-swift\") pod \"swift-storage-0\" (UID: \"f53dddde-f595-46a9-9764-dce250c7f5b0\") " pod="openstack/swift-storage-0" Feb 28 09:19:22 crc kubenswrapper[4687]: I0228 09:19:22.623256 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f53dddde-f595-46a9-9764-dce250c7f5b0-etc-swift\") pod \"swift-storage-0\" (UID: \"f53dddde-f595-46a9-9764-dce250c7f5b0\") " pod="openstack/swift-storage-0" Feb 28 09:19:22 crc kubenswrapper[4687]: I0228 09:19:22.627150 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 28 09:19:23 crc kubenswrapper[4687]: I0228 09:19:23.634859 4687 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-grkmn" podUID="b7837572-8dcc-409d-b8fd-c37f2af52474" containerName="ovn-controller" probeResult="failure" output=< Feb 28 09:19:23 crc kubenswrapper[4687]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 28 09:19:23 crc kubenswrapper[4687]: > Feb 28 09:19:23 crc kubenswrapper[4687]: I0228 09:19:23.645916 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-kbhr4" Feb 28 09:19:23 crc kubenswrapper[4687]: I0228 09:19:23.647391 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-kbhr4" Feb 28 09:19:23 crc kubenswrapper[4687]: I0228 09:19:23.861393 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-grkmn-config-7ccl6"] Feb 28 09:19:23 crc kubenswrapper[4687]: I0228 09:19:23.862481 4687 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/ovn-controller-grkmn-config-7ccl6" Feb 28 09:19:23 crc kubenswrapper[4687]: I0228 09:19:23.864676 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 28 09:19:23 crc kubenswrapper[4687]: I0228 09:19:23.865553 4687 generic.go:334] "Generic (PLEG): container finished" podID="171eb8fe-deaf-4936-b51d-de02b4131b8b" containerID="6c7e5035e6c7381269e50141c66991933c97603d3e9469d3c92f79c4e27e4068" exitCode=0 Feb 28 09:19:23 crc kubenswrapper[4687]: I0228 09:19:23.865650 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"171eb8fe-deaf-4936-b51d-de02b4131b8b","Type":"ContainerDied","Data":"6c7e5035e6c7381269e50141c66991933c97603d3e9469d3c92f79c4e27e4068"} Feb 28 09:19:23 crc kubenswrapper[4687]: I0228 09:19:23.867831 4687 generic.go:334] "Generic (PLEG): container finished" podID="541f5799-4b5e-4767-aca7-8c3738502a06" containerID="fc6036d26129118d267b8cba85e86a89e8d8f4544e9f0c7c8c7911aa86fdebc9" exitCode=0 Feb 28 09:19:23 crc kubenswrapper[4687]: I0228 09:19:23.868396 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"541f5799-4b5e-4767-aca7-8c3738502a06","Type":"ContainerDied","Data":"fc6036d26129118d267b8cba85e86a89e8d8f4544e9f0c7c8c7911aa86fdebc9"} Feb 28 09:19:23 crc kubenswrapper[4687]: I0228 09:19:23.870336 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-grkmn-config-7ccl6"] Feb 28 09:19:24 crc kubenswrapper[4687]: I0228 09:19:24.051831 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fc151677-310d-4edc-bfee-d03a8b67487b-var-log-ovn\") pod \"ovn-controller-grkmn-config-7ccl6\" (UID: \"fc151677-310d-4edc-bfee-d03a8b67487b\") " pod="openstack/ovn-controller-grkmn-config-7ccl6" Feb 28 09:19:24 crc 
kubenswrapper[4687]: I0228 09:19:24.052200 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc151677-310d-4edc-bfee-d03a8b67487b-scripts\") pod \"ovn-controller-grkmn-config-7ccl6\" (UID: \"fc151677-310d-4edc-bfee-d03a8b67487b\") " pod="openstack/ovn-controller-grkmn-config-7ccl6" Feb 28 09:19:24 crc kubenswrapper[4687]: I0228 09:19:24.052251 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc151677-310d-4edc-bfee-d03a8b67487b-var-run-ovn\") pod \"ovn-controller-grkmn-config-7ccl6\" (UID: \"fc151677-310d-4edc-bfee-d03a8b67487b\") " pod="openstack/ovn-controller-grkmn-config-7ccl6" Feb 28 09:19:24 crc kubenswrapper[4687]: I0228 09:19:24.052875 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggbk6\" (UniqueName: \"kubernetes.io/projected/fc151677-310d-4edc-bfee-d03a8b67487b-kube-api-access-ggbk6\") pod \"ovn-controller-grkmn-config-7ccl6\" (UID: \"fc151677-310d-4edc-bfee-d03a8b67487b\") " pod="openstack/ovn-controller-grkmn-config-7ccl6" Feb 28 09:19:24 crc kubenswrapper[4687]: I0228 09:19:24.053357 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fc151677-310d-4edc-bfee-d03a8b67487b-var-run\") pod \"ovn-controller-grkmn-config-7ccl6\" (UID: \"fc151677-310d-4edc-bfee-d03a8b67487b\") " pod="openstack/ovn-controller-grkmn-config-7ccl6" Feb 28 09:19:24 crc kubenswrapper[4687]: I0228 09:19:24.053521 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fc151677-310d-4edc-bfee-d03a8b67487b-additional-scripts\") pod \"ovn-controller-grkmn-config-7ccl6\" (UID: 
\"fc151677-310d-4edc-bfee-d03a8b67487b\") " pod="openstack/ovn-controller-grkmn-config-7ccl6" Feb 28 09:19:24 crc kubenswrapper[4687]: I0228 09:19:24.159576 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggbk6\" (UniqueName: \"kubernetes.io/projected/fc151677-310d-4edc-bfee-d03a8b67487b-kube-api-access-ggbk6\") pod \"ovn-controller-grkmn-config-7ccl6\" (UID: \"fc151677-310d-4edc-bfee-d03a8b67487b\") " pod="openstack/ovn-controller-grkmn-config-7ccl6" Feb 28 09:19:24 crc kubenswrapper[4687]: I0228 09:19:24.159729 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fc151677-310d-4edc-bfee-d03a8b67487b-var-run\") pod \"ovn-controller-grkmn-config-7ccl6\" (UID: \"fc151677-310d-4edc-bfee-d03a8b67487b\") " pod="openstack/ovn-controller-grkmn-config-7ccl6" Feb 28 09:19:24 crc kubenswrapper[4687]: I0228 09:19:24.159944 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fc151677-310d-4edc-bfee-d03a8b67487b-additional-scripts\") pod \"ovn-controller-grkmn-config-7ccl6\" (UID: \"fc151677-310d-4edc-bfee-d03a8b67487b\") " pod="openstack/ovn-controller-grkmn-config-7ccl6" Feb 28 09:19:24 crc kubenswrapper[4687]: I0228 09:19:24.160045 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fc151677-310d-4edc-bfee-d03a8b67487b-var-log-ovn\") pod \"ovn-controller-grkmn-config-7ccl6\" (UID: \"fc151677-310d-4edc-bfee-d03a8b67487b\") " pod="openstack/ovn-controller-grkmn-config-7ccl6" Feb 28 09:19:24 crc kubenswrapper[4687]: I0228 09:19:24.160152 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc151677-310d-4edc-bfee-d03a8b67487b-scripts\") pod \"ovn-controller-grkmn-config-7ccl6\" (UID: 
\"fc151677-310d-4edc-bfee-d03a8b67487b\") " pod="openstack/ovn-controller-grkmn-config-7ccl6" Feb 28 09:19:24 crc kubenswrapper[4687]: I0228 09:19:24.160255 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc151677-310d-4edc-bfee-d03a8b67487b-var-run-ovn\") pod \"ovn-controller-grkmn-config-7ccl6\" (UID: \"fc151677-310d-4edc-bfee-d03a8b67487b\") " pod="openstack/ovn-controller-grkmn-config-7ccl6" Feb 28 09:19:24 crc kubenswrapper[4687]: I0228 09:19:24.160767 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc151677-310d-4edc-bfee-d03a8b67487b-var-run-ovn\") pod \"ovn-controller-grkmn-config-7ccl6\" (UID: \"fc151677-310d-4edc-bfee-d03a8b67487b\") " pod="openstack/ovn-controller-grkmn-config-7ccl6" Feb 28 09:19:24 crc kubenswrapper[4687]: I0228 09:19:24.161307 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fc151677-310d-4edc-bfee-d03a8b67487b-var-run\") pod \"ovn-controller-grkmn-config-7ccl6\" (UID: \"fc151677-310d-4edc-bfee-d03a8b67487b\") " pod="openstack/ovn-controller-grkmn-config-7ccl6" Feb 28 09:19:24 crc kubenswrapper[4687]: I0228 09:19:24.162180 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fc151677-310d-4edc-bfee-d03a8b67487b-additional-scripts\") pod \"ovn-controller-grkmn-config-7ccl6\" (UID: \"fc151677-310d-4edc-bfee-d03a8b67487b\") " pod="openstack/ovn-controller-grkmn-config-7ccl6" Feb 28 09:19:24 crc kubenswrapper[4687]: I0228 09:19:24.162241 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fc151677-310d-4edc-bfee-d03a8b67487b-var-log-ovn\") pod \"ovn-controller-grkmn-config-7ccl6\" (UID: \"fc151677-310d-4edc-bfee-d03a8b67487b\") " 
pod="openstack/ovn-controller-grkmn-config-7ccl6" Feb 28 09:19:24 crc kubenswrapper[4687]: I0228 09:19:24.163801 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc151677-310d-4edc-bfee-d03a8b67487b-scripts\") pod \"ovn-controller-grkmn-config-7ccl6\" (UID: \"fc151677-310d-4edc-bfee-d03a8b67487b\") " pod="openstack/ovn-controller-grkmn-config-7ccl6" Feb 28 09:19:24 crc kubenswrapper[4687]: I0228 09:19:24.193807 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggbk6\" (UniqueName: \"kubernetes.io/projected/fc151677-310d-4edc-bfee-d03a8b67487b-kube-api-access-ggbk6\") pod \"ovn-controller-grkmn-config-7ccl6\" (UID: \"fc151677-310d-4edc-bfee-d03a8b67487b\") " pod="openstack/ovn-controller-grkmn-config-7ccl6" Feb 28 09:19:24 crc kubenswrapper[4687]: I0228 09:19:24.477377 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-grkmn-config-7ccl6" Feb 28 09:19:25 crc kubenswrapper[4687]: I0228 09:19:25.002514 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:19:25 crc kubenswrapper[4687]: I0228 09:19:25.002577 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.443564 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-s57nv" Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.614218 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6cf929c8-d005-4feb-8eb4-544e89507ad9-dispersionconf\") pod \"6cf929c8-d005-4feb-8eb4-544e89507ad9\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.614272 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6cf929c8-d005-4feb-8eb4-544e89507ad9-swiftconf\") pod \"6cf929c8-d005-4feb-8eb4-544e89507ad9\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.614334 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6cf929c8-d005-4feb-8eb4-544e89507ad9-etc-swift\") pod \"6cf929c8-d005-4feb-8eb4-544e89507ad9\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.614658 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhg62\" (UniqueName: \"kubernetes.io/projected/6cf929c8-d005-4feb-8eb4-544e89507ad9-kube-api-access-nhg62\") pod \"6cf929c8-d005-4feb-8eb4-544e89507ad9\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.614702 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf929c8-d005-4feb-8eb4-544e89507ad9-combined-ca-bundle\") pod \"6cf929c8-d005-4feb-8eb4-544e89507ad9\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.614738 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/6cf929c8-d005-4feb-8eb4-544e89507ad9-ring-data-devices\") pod \"6cf929c8-d005-4feb-8eb4-544e89507ad9\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.614763 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6cf929c8-d005-4feb-8eb4-544e89507ad9-scripts\") pod \"6cf929c8-d005-4feb-8eb4-544e89507ad9\" (UID: \"6cf929c8-d005-4feb-8eb4-544e89507ad9\") " Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.616507 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cf929c8-d005-4feb-8eb4-544e89507ad9-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6cf929c8-d005-4feb-8eb4-544e89507ad9" (UID: "6cf929c8-d005-4feb-8eb4-544e89507ad9"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.619387 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cf929c8-d005-4feb-8eb4-544e89507ad9-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6cf929c8-d005-4feb-8eb4-544e89507ad9" (UID: "6cf929c8-d005-4feb-8eb4-544e89507ad9"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.622966 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cf929c8-d005-4feb-8eb4-544e89507ad9-kube-api-access-nhg62" (OuterVolumeSpecName: "kube-api-access-nhg62") pod "6cf929c8-d005-4feb-8eb4-544e89507ad9" (UID: "6cf929c8-d005-4feb-8eb4-544e89507ad9"). InnerVolumeSpecName "kube-api-access-nhg62". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.623914 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cf929c8-d005-4feb-8eb4-544e89507ad9-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6cf929c8-d005-4feb-8eb4-544e89507ad9" (UID: "6cf929c8-d005-4feb-8eb4-544e89507ad9"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.635772 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cf929c8-d005-4feb-8eb4-544e89507ad9-scripts" (OuterVolumeSpecName: "scripts") pod "6cf929c8-d005-4feb-8eb4-544e89507ad9" (UID: "6cf929c8-d005-4feb-8eb4-544e89507ad9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.636925 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cf929c8-d005-4feb-8eb4-544e89507ad9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6cf929c8-d005-4feb-8eb4-544e89507ad9" (UID: "6cf929c8-d005-4feb-8eb4-544e89507ad9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.639448 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cf929c8-d005-4feb-8eb4-544e89507ad9-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6cf929c8-d005-4feb-8eb4-544e89507ad9" (UID: "6cf929c8-d005-4feb-8eb4-544e89507ad9"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.707207 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vckkp"] Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.717841 4687 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6cf929c8-d005-4feb-8eb4-544e89507ad9-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.717880 4687 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6cf929c8-d005-4feb-8eb4-544e89507ad9-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.717889 4687 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6cf929c8-d005-4feb-8eb4-544e89507ad9-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.717899 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhg62\" (UniqueName: \"kubernetes.io/projected/6cf929c8-d005-4feb-8eb4-544e89507ad9-kube-api-access-nhg62\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.717914 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf929c8-d005-4feb-8eb4-544e89507ad9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.717923 4687 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6cf929c8-d005-4feb-8eb4-544e89507ad9-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.717930 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/6cf929c8-d005-4feb-8eb4-544e89507ad9-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:26 crc kubenswrapper[4687]: W0228 09:19:26.721005 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48f20836_ec64_4206_8f2c_4db709f61459.slice/crio-fc1512a11715ee1cf3f4fed01922db489481ed2ce23a8c1741be8fa701cacd7f WatchSource:0}: Error finding container fc1512a11715ee1cf3f4fed01922db489481ed2ce23a8c1741be8fa701cacd7f: Status 404 returned error can't find the container with id fc1512a11715ee1cf3f4fed01922db489481ed2ce23a8c1741be8fa701cacd7f Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.788150 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 28 09:19:26 crc kubenswrapper[4687]: W0228 09:19:26.801301 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf53dddde_f595_46a9_9764_dce250c7f5b0.slice/crio-dd2cea173a2fb3343f90a56fc8c3c64de4c8258dbf1c33daa767387544e4bfcc WatchSource:0}: Error finding container dd2cea173a2fb3343f90a56fc8c3c64de4c8258dbf1c33daa767387544e4bfcc: Status 404 returned error can't find the container with id dd2cea173a2fb3343f90a56fc8c3c64de4c8258dbf1c33daa767387544e4bfcc Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.807295 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-grkmn-config-7ccl6"] Feb 28 09:19:26 crc kubenswrapper[4687]: W0228 09:19:26.826178 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc151677_310d_4edc_bfee_d03a8b67487b.slice/crio-0aa4d4b2f978d967da76dc9ba269ed2bc13af38e8bd417e52d2903eccedf77a1 WatchSource:0}: Error finding container 0aa4d4b2f978d967da76dc9ba269ed2bc13af38e8bd417e52d2903eccedf77a1: Status 404 returned error can't find the container with id 
0aa4d4b2f978d967da76dc9ba269ed2bc13af38e8bd417e52d2903eccedf77a1 Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.895574 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-grkmn-config-7ccl6" event={"ID":"fc151677-310d-4edc-bfee-d03a8b67487b","Type":"ContainerStarted","Data":"0aa4d4b2f978d967da76dc9ba269ed2bc13af38e8bd417e52d2903eccedf77a1"} Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.897871 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-l9np4" event={"ID":"c8549972-64f9-4f47-a3db-42053850adb4","Type":"ContainerStarted","Data":"5471f01c51c2f9c5c3073b547ee63f530f92e564980b01ee9de3b8792e11deba"} Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.900452 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"171eb8fe-deaf-4936-b51d-de02b4131b8b","Type":"ContainerStarted","Data":"96a5955dcccd771e543c70d22a60fc61d48e862846d9534debc8d49a460704c1"} Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.900770 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.902378 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vckkp" event={"ID":"48f20836-ec64-4206-8f2c-4db709f61459","Type":"ContainerStarted","Data":"bcf5ef88e87919d4c1b68e62847cbfc6b2632c64d9b9b64f06ca5273977960a1"} Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.902413 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vckkp" event={"ID":"48f20836-ec64-4206-8f2c-4db709f61459","Type":"ContainerStarted","Data":"fc1512a11715ee1cf3f4fed01922db489481ed2ce23a8c1741be8fa701cacd7f"} Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.904972 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"541f5799-4b5e-4767-aca7-8c3738502a06","Type":"ContainerStarted","Data":"cc10b6b23a3eab63c3944f46eeb03c0ab55ae001902fe5a9f2a6bae319a6709d"} Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.905217 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.907371 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-s57nv" Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.907378 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s57nv" event={"ID":"6cf929c8-d005-4feb-8eb4-544e89507ad9","Type":"ContainerDied","Data":"2c6290a7e8024c689ee1327fe9dd906f8d080e09dab438f3a32d9e19d6a5bead"} Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.907427 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c6290a7e8024c689ee1327fe9dd906f8d080e09dab438f3a32d9e19d6a5bead" Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.910042 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53dddde-f595-46a9-9764-dce250c7f5b0","Type":"ContainerStarted","Data":"dd2cea173a2fb3343f90a56fc8c3c64de4c8258dbf1c33daa767387544e4bfcc"} Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.919939 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-l9np4" podStartSLOduration=2.933494642 podStartE2EDuration="13.919929865s" podCreationTimestamp="2026-02-28 09:19:13 +0000 UTC" firstStartedPulling="2026-02-28 09:19:15.321647108 +0000 UTC m=+947.012216445" lastFinishedPulling="2026-02-28 09:19:26.308082331 +0000 UTC m=+957.998651668" observedRunningTime="2026-02-28 09:19:26.914279868 +0000 UTC m=+958.604849206" watchObservedRunningTime="2026-02-28 09:19:26.919929865 +0000 UTC m=+958.610499202" Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 
09:19:26.944443 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=48.899102742 podStartE2EDuration="57.944425068s" podCreationTimestamp="2026-02-28 09:18:29 +0000 UTC" firstStartedPulling="2026-02-28 09:18:40.445828963 +0000 UTC m=+912.136398300" lastFinishedPulling="2026-02-28 09:18:49.49115129 +0000 UTC m=+921.181720626" observedRunningTime="2026-02-28 09:19:26.941854883 +0000 UTC m=+958.632424221" watchObservedRunningTime="2026-02-28 09:19:26.944425068 +0000 UTC m=+958.634994404" Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.974881 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=49.410584483 podStartE2EDuration="58.97485913s" podCreationTimestamp="2026-02-28 09:18:28 +0000 UTC" firstStartedPulling="2026-02-28 09:18:40.41480475 +0000 UTC m=+912.105374088" lastFinishedPulling="2026-02-28 09:18:49.979079398 +0000 UTC m=+921.669648735" observedRunningTime="2026-02-28 09:19:26.972734985 +0000 UTC m=+958.663304322" watchObservedRunningTime="2026-02-28 09:19:26.97485913 +0000 UTC m=+958.665428467" Feb 28 09:19:26 crc kubenswrapper[4687]: I0228 09:19:26.991378 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-vckkp" podStartSLOduration=5.991361969 podStartE2EDuration="5.991361969s" podCreationTimestamp="2026-02-28 09:19:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:19:26.986769562 +0000 UTC m=+958.677338900" watchObservedRunningTime="2026-02-28 09:19:26.991361969 +0000 UTC m=+958.681931307" Feb 28 09:19:27 crc kubenswrapper[4687]: I0228 09:19:27.925483 4687 generic.go:334] "Generic (PLEG): container finished" podID="fc151677-310d-4edc-bfee-d03a8b67487b" containerID="c15344c70aef7423dfc2e08971dd45b9339c447cf7155ff2fb15d14bf09fdc1d" 
exitCode=0 Feb 28 09:19:27 crc kubenswrapper[4687]: I0228 09:19:27.925603 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-grkmn-config-7ccl6" event={"ID":"fc151677-310d-4edc-bfee-d03a8b67487b","Type":"ContainerDied","Data":"c15344c70aef7423dfc2e08971dd45b9339c447cf7155ff2fb15d14bf09fdc1d"} Feb 28 09:19:27 crc kubenswrapper[4687]: I0228 09:19:27.931659 4687 generic.go:334] "Generic (PLEG): container finished" podID="48f20836-ec64-4206-8f2c-4db709f61459" containerID="bcf5ef88e87919d4c1b68e62847cbfc6b2632c64d9b9b64f06ca5273977960a1" exitCode=0 Feb 28 09:19:27 crc kubenswrapper[4687]: I0228 09:19:27.931836 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vckkp" event={"ID":"48f20836-ec64-4206-8f2c-4db709f61459","Type":"ContainerDied","Data":"bcf5ef88e87919d4c1b68e62847cbfc6b2632c64d9b9b64f06ca5273977960a1"} Feb 28 09:19:28 crc kubenswrapper[4687]: I0228 09:19:28.640080 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-grkmn" Feb 28 09:19:28 crc kubenswrapper[4687]: I0228 09:19:28.941740 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53dddde-f595-46a9-9764-dce250c7f5b0","Type":"ContainerStarted","Data":"7d132a38d104f28ef77b2040f129f5f53e69ee731b6b541458d0d4a776cc05a7"} Feb 28 09:19:28 crc kubenswrapper[4687]: I0228 09:19:28.942003 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53dddde-f595-46a9-9764-dce250c7f5b0","Type":"ContainerStarted","Data":"13f7f56b40a1544d5bf562c85f090f4824a5fd0fde5524b062247976150f3255"} Feb 28 09:19:28 crc kubenswrapper[4687]: I0228 09:19:28.942041 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53dddde-f595-46a9-9764-dce250c7f5b0","Type":"ContainerStarted","Data":"696ba22de36a57c69ff40e47f5167d7b3e874fcd6c348d324bcf64229182d750"} Feb 28 09:19:28 crc 
kubenswrapper[4687]: I0228 09:19:28.942050 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53dddde-f595-46a9-9764-dce250c7f5b0","Type":"ContainerStarted","Data":"dff805bf62fb3abdddfb82e5bfe5b2fa6cab49de88218ec38eb1985d6ae2a14b"} Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.354288 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vckkp" Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.359129 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-grkmn-config-7ccl6" Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.499831 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc151677-310d-4edc-bfee-d03a8b67487b-var-run-ovn\") pod \"fc151677-310d-4edc-bfee-d03a8b67487b\" (UID: \"fc151677-310d-4edc-bfee-d03a8b67487b\") " Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.500255 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc151677-310d-4edc-bfee-d03a8b67487b-scripts\") pod \"fc151677-310d-4edc-bfee-d03a8b67487b\" (UID: \"fc151677-310d-4edc-bfee-d03a8b67487b\") " Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.500285 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggbk6\" (UniqueName: \"kubernetes.io/projected/fc151677-310d-4edc-bfee-d03a8b67487b-kube-api-access-ggbk6\") pod \"fc151677-310d-4edc-bfee-d03a8b67487b\" (UID: \"fc151677-310d-4edc-bfee-d03a8b67487b\") " Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.500407 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fc151677-310d-4edc-bfee-d03a8b67487b-var-run\") pod 
\"fc151677-310d-4edc-bfee-d03a8b67487b\" (UID: \"fc151677-310d-4edc-bfee-d03a8b67487b\") " Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.500430 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fc151677-310d-4edc-bfee-d03a8b67487b-var-log-ovn\") pod \"fc151677-310d-4edc-bfee-d03a8b67487b\" (UID: \"fc151677-310d-4edc-bfee-d03a8b67487b\") " Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.500448 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48f20836-ec64-4206-8f2c-4db709f61459-operator-scripts\") pod \"48f20836-ec64-4206-8f2c-4db709f61459\" (UID: \"48f20836-ec64-4206-8f2c-4db709f61459\") " Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.500470 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2w5v\" (UniqueName: \"kubernetes.io/projected/48f20836-ec64-4206-8f2c-4db709f61459-kube-api-access-j2w5v\") pod \"48f20836-ec64-4206-8f2c-4db709f61459\" (UID: \"48f20836-ec64-4206-8f2c-4db709f61459\") " Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.500488 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fc151677-310d-4edc-bfee-d03a8b67487b-additional-scripts\") pod \"fc151677-310d-4edc-bfee-d03a8b67487b\" (UID: \"fc151677-310d-4edc-bfee-d03a8b67487b\") " Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.499978 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc151677-310d-4edc-bfee-d03a8b67487b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "fc151677-310d-4edc-bfee-d03a8b67487b" (UID: "fc151677-310d-4edc-bfee-d03a8b67487b"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.501321 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc151677-310d-4edc-bfee-d03a8b67487b-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "fc151677-310d-4edc-bfee-d03a8b67487b" (UID: "fc151677-310d-4edc-bfee-d03a8b67487b"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.501898 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc151677-310d-4edc-bfee-d03a8b67487b-scripts" (OuterVolumeSpecName: "scripts") pod "fc151677-310d-4edc-bfee-d03a8b67487b" (UID: "fc151677-310d-4edc-bfee-d03a8b67487b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.501893 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc151677-310d-4edc-bfee-d03a8b67487b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "fc151677-310d-4edc-bfee-d03a8b67487b" (UID: "fc151677-310d-4edc-bfee-d03a8b67487b"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.502258 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48f20836-ec64-4206-8f2c-4db709f61459-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "48f20836-ec64-4206-8f2c-4db709f61459" (UID: "48f20836-ec64-4206-8f2c-4db709f61459"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.502545 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc151677-310d-4edc-bfee-d03a8b67487b-var-run" (OuterVolumeSpecName: "var-run") pod "fc151677-310d-4edc-bfee-d03a8b67487b" (UID: "fc151677-310d-4edc-bfee-d03a8b67487b"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.508633 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc151677-310d-4edc-bfee-d03a8b67487b-kube-api-access-ggbk6" (OuterVolumeSpecName: "kube-api-access-ggbk6") pod "fc151677-310d-4edc-bfee-d03a8b67487b" (UID: "fc151677-310d-4edc-bfee-d03a8b67487b"). InnerVolumeSpecName "kube-api-access-ggbk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.508916 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48f20836-ec64-4206-8f2c-4db709f61459-kube-api-access-j2w5v" (OuterVolumeSpecName: "kube-api-access-j2w5v") pod "48f20836-ec64-4206-8f2c-4db709f61459" (UID: "48f20836-ec64-4206-8f2c-4db709f61459"). InnerVolumeSpecName "kube-api-access-j2w5v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.601938 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc151677-310d-4edc-bfee-d03a8b67487b-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.602204 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggbk6\" (UniqueName: \"kubernetes.io/projected/fc151677-310d-4edc-bfee-d03a8b67487b-kube-api-access-ggbk6\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.602284 4687 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fc151677-310d-4edc-bfee-d03a8b67487b-var-run\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.602349 4687 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fc151677-310d-4edc-bfee-d03a8b67487b-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.602408 4687 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48f20836-ec64-4206-8f2c-4db709f61459-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.602460 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2w5v\" (UniqueName: \"kubernetes.io/projected/48f20836-ec64-4206-8f2c-4db709f61459-kube-api-access-j2w5v\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.602515 4687 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fc151677-310d-4edc-bfee-d03a8b67487b-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 
09:19:29.602567 4687 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc151677-310d-4edc-bfee-d03a8b67487b-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.954481 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vckkp" event={"ID":"48f20836-ec64-4206-8f2c-4db709f61459","Type":"ContainerDied","Data":"fc1512a11715ee1cf3f4fed01922db489481ed2ce23a8c1741be8fa701cacd7f"} Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.954582 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc1512a11715ee1cf3f4fed01922db489481ed2ce23a8c1741be8fa701cacd7f" Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.954523 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vckkp" Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.957674 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-grkmn-config-7ccl6" event={"ID":"fc151677-310d-4edc-bfee-d03a8b67487b","Type":"ContainerDied","Data":"0aa4d4b2f978d967da76dc9ba269ed2bc13af38e8bd417e52d2903eccedf77a1"} Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.957830 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0aa4d4b2f978d967da76dc9ba269ed2bc13af38e8bd417e52d2903eccedf77a1" Feb 28 09:19:29 crc kubenswrapper[4687]: I0228 09:19:29.957755 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-grkmn-config-7ccl6" Feb 28 09:19:30 crc kubenswrapper[4687]: I0228 09:19:30.452224 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-grkmn-config-7ccl6"] Feb 28 09:19:30 crc kubenswrapper[4687]: I0228 09:19:30.458194 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-grkmn-config-7ccl6"] Feb 28 09:19:30 crc kubenswrapper[4687]: I0228 09:19:30.667331 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc151677-310d-4edc-bfee-d03a8b67487b" path="/var/lib/kubelet/pods/fc151677-310d-4edc-bfee-d03a8b67487b/volumes" Feb 28 09:19:30 crc kubenswrapper[4687]: I0228 09:19:30.969508 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53dddde-f595-46a9-9764-dce250c7f5b0","Type":"ContainerStarted","Data":"57824061cf07efd18b917b320349fd62cb6b638b207d3369021c6cd3fd028bef"} Feb 28 09:19:30 crc kubenswrapper[4687]: I0228 09:19:30.969572 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53dddde-f595-46a9-9764-dce250c7f5b0","Type":"ContainerStarted","Data":"7760c364cb9d9868bda014052cc6bf4b5348df6cc0987adb642629073903b8cf"} Feb 28 09:19:30 crc kubenswrapper[4687]: I0228 09:19:30.971514 4687 generic.go:334] "Generic (PLEG): container finished" podID="c8549972-64f9-4f47-a3db-42053850adb4" containerID="5471f01c51c2f9c5c3073b547ee63f530f92e564980b01ee9de3b8792e11deba" exitCode=0 Feb 28 09:19:30 crc kubenswrapper[4687]: I0228 09:19:30.971577 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-l9np4" event={"ID":"c8549972-64f9-4f47-a3db-42053850adb4","Type":"ContainerDied","Data":"5471f01c51c2f9c5c3073b547ee63f530f92e564980b01ee9de3b8792e11deba"} Feb 28 09:19:32 crc kubenswrapper[4687]: I0228 09:19:32.310686 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-l9np4" Feb 28 09:19:32 crc kubenswrapper[4687]: I0228 09:19:32.351789 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8549972-64f9-4f47-a3db-42053850adb4-config-data\") pod \"c8549972-64f9-4f47-a3db-42053850adb4\" (UID: \"c8549972-64f9-4f47-a3db-42053850adb4\") " Feb 28 09:19:32 crc kubenswrapper[4687]: I0228 09:19:32.351849 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c8549972-64f9-4f47-a3db-42053850adb4-db-sync-config-data\") pod \"c8549972-64f9-4f47-a3db-42053850adb4\" (UID: \"c8549972-64f9-4f47-a3db-42053850adb4\") " Feb 28 09:19:32 crc kubenswrapper[4687]: I0228 09:19:32.357568 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8549972-64f9-4f47-a3db-42053850adb4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c8549972-64f9-4f47-a3db-42053850adb4" (UID: "c8549972-64f9-4f47-a3db-42053850adb4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:19:32 crc kubenswrapper[4687]: I0228 09:19:32.387731 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8549972-64f9-4f47-a3db-42053850adb4-config-data" (OuterVolumeSpecName: "config-data") pod "c8549972-64f9-4f47-a3db-42053850adb4" (UID: "c8549972-64f9-4f47-a3db-42053850adb4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:19:32 crc kubenswrapper[4687]: I0228 09:19:32.453151 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qh8x\" (UniqueName: \"kubernetes.io/projected/c8549972-64f9-4f47-a3db-42053850adb4-kube-api-access-5qh8x\") pod \"c8549972-64f9-4f47-a3db-42053850adb4\" (UID: \"c8549972-64f9-4f47-a3db-42053850adb4\") " Feb 28 09:19:32 crc kubenswrapper[4687]: I0228 09:19:32.453202 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8549972-64f9-4f47-a3db-42053850adb4-combined-ca-bundle\") pod \"c8549972-64f9-4f47-a3db-42053850adb4\" (UID: \"c8549972-64f9-4f47-a3db-42053850adb4\") " Feb 28 09:19:32 crc kubenswrapper[4687]: I0228 09:19:32.453519 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8549972-64f9-4f47-a3db-42053850adb4-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:32 crc kubenswrapper[4687]: I0228 09:19:32.453541 4687 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c8549972-64f9-4f47-a3db-42053850adb4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:32 crc kubenswrapper[4687]: I0228 09:19:32.456631 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8549972-64f9-4f47-a3db-42053850adb4-kube-api-access-5qh8x" (OuterVolumeSpecName: "kube-api-access-5qh8x") pod "c8549972-64f9-4f47-a3db-42053850adb4" (UID: "c8549972-64f9-4f47-a3db-42053850adb4"). InnerVolumeSpecName "kube-api-access-5qh8x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:19:32 crc kubenswrapper[4687]: I0228 09:19:32.470059 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8549972-64f9-4f47-a3db-42053850adb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8549972-64f9-4f47-a3db-42053850adb4" (UID: "c8549972-64f9-4f47-a3db-42053850adb4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:19:32 crc kubenswrapper[4687]: I0228 09:19:32.554171 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8549972-64f9-4f47-a3db-42053850adb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:32 crc kubenswrapper[4687]: I0228 09:19:32.554211 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qh8x\" (UniqueName: \"kubernetes.io/projected/c8549972-64f9-4f47-a3db-42053850adb4-kube-api-access-5qh8x\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:32 crc kubenswrapper[4687]: I0228 09:19:32.989921 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-l9np4" event={"ID":"c8549972-64f9-4f47-a3db-42053850adb4","Type":"ContainerDied","Data":"e1de3648a58a6b68cfc506e7a9fb43105e359daf47f8c4a5f1101a8b3214f81d"} Feb 28 09:19:32 crc kubenswrapper[4687]: I0228 09:19:32.990271 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1de3648a58a6b68cfc506e7a9fb43105e359daf47f8c4a5f1101a8b3214f81d" Feb 28 09:19:32 crc kubenswrapper[4687]: I0228 09:19:32.990070 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-l9np4" Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.325555 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-s65t9"] Feb 28 09:19:33 crc kubenswrapper[4687]: E0228 09:19:33.326669 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf929c8-d005-4feb-8eb4-544e89507ad9" containerName="swift-ring-rebalance" Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.326770 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf929c8-d005-4feb-8eb4-544e89507ad9" containerName="swift-ring-rebalance" Feb 28 09:19:33 crc kubenswrapper[4687]: E0228 09:19:33.326849 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f20836-ec64-4206-8f2c-4db709f61459" containerName="mariadb-account-create-update" Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.326905 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f20836-ec64-4206-8f2c-4db709f61459" containerName="mariadb-account-create-update" Feb 28 09:19:33 crc kubenswrapper[4687]: E0228 09:19:33.326979 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc151677-310d-4edc-bfee-d03a8b67487b" containerName="ovn-config" Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.327072 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc151677-310d-4edc-bfee-d03a8b67487b" containerName="ovn-config" Feb 28 09:19:33 crc kubenswrapper[4687]: E0228 09:19:33.327135 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8549972-64f9-4f47-a3db-42053850adb4" containerName="glance-db-sync" Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.327188 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8549972-64f9-4f47-a3db-42053850adb4" containerName="glance-db-sync" Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.327411 4687 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6cf929c8-d005-4feb-8eb4-544e89507ad9" containerName="swift-ring-rebalance" Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.327485 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="48f20836-ec64-4206-8f2c-4db709f61459" containerName="mariadb-account-create-update" Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.327549 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8549972-64f9-4f47-a3db-42053850adb4" containerName="glance-db-sync" Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.327597 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc151677-310d-4edc-bfee-d03a8b67487b" containerName="ovn-config" Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.328753 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.345603 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-s65t9"] Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.371277 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab16bde4-118c-4c9b-bea3-01508818e448-ovsdbserver-nb\") pod \"dnsmasq-dns-7f58d6bb6f-s65t9\" (UID: \"ab16bde4-118c-4c9b-bea3-01508818e448\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.371408 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab16bde4-118c-4c9b-bea3-01508818e448-config\") pod \"dnsmasq-dns-7f58d6bb6f-s65t9\" (UID: \"ab16bde4-118c-4c9b-bea3-01508818e448\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.371470 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab16bde4-118c-4c9b-bea3-01508818e448-ovsdbserver-sb\") pod \"dnsmasq-dns-7f58d6bb6f-s65t9\" (UID: \"ab16bde4-118c-4c9b-bea3-01508818e448\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.371645 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab16bde4-118c-4c9b-bea3-01508818e448-dns-svc\") pod \"dnsmasq-dns-7f58d6bb6f-s65t9\" (UID: \"ab16bde4-118c-4c9b-bea3-01508818e448\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.371823 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv7js\" (UniqueName: \"kubernetes.io/projected/ab16bde4-118c-4c9b-bea3-01508818e448-kube-api-access-pv7js\") pod \"dnsmasq-dns-7f58d6bb6f-s65t9\" (UID: \"ab16bde4-118c-4c9b-bea3-01508818e448\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.473494 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab16bde4-118c-4c9b-bea3-01508818e448-ovsdbserver-nb\") pod \"dnsmasq-dns-7f58d6bb6f-s65t9\" (UID: \"ab16bde4-118c-4c9b-bea3-01508818e448\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.473602 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab16bde4-118c-4c9b-bea3-01508818e448-config\") pod \"dnsmasq-dns-7f58d6bb6f-s65t9\" (UID: \"ab16bde4-118c-4c9b-bea3-01508818e448\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.473639 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/ab16bde4-118c-4c9b-bea3-01508818e448-ovsdbserver-sb\") pod \"dnsmasq-dns-7f58d6bb6f-s65t9\" (UID: \"ab16bde4-118c-4c9b-bea3-01508818e448\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.473676 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab16bde4-118c-4c9b-bea3-01508818e448-dns-svc\") pod \"dnsmasq-dns-7f58d6bb6f-s65t9\" (UID: \"ab16bde4-118c-4c9b-bea3-01508818e448\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.473707 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv7js\" (UniqueName: \"kubernetes.io/projected/ab16bde4-118c-4c9b-bea3-01508818e448-kube-api-access-pv7js\") pod \"dnsmasq-dns-7f58d6bb6f-s65t9\" (UID: \"ab16bde4-118c-4c9b-bea3-01508818e448\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.474890 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab16bde4-118c-4c9b-bea3-01508818e448-ovsdbserver-nb\") pod \"dnsmasq-dns-7f58d6bb6f-s65t9\" (UID: \"ab16bde4-118c-4c9b-bea3-01508818e448\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.475492 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab16bde4-118c-4c9b-bea3-01508818e448-config\") pod \"dnsmasq-dns-7f58d6bb6f-s65t9\" (UID: \"ab16bde4-118c-4c9b-bea3-01508818e448\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.475822 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab16bde4-118c-4c9b-bea3-01508818e448-ovsdbserver-sb\") 
pod \"dnsmasq-dns-7f58d6bb6f-s65t9\" (UID: \"ab16bde4-118c-4c9b-bea3-01508818e448\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.475954 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab16bde4-118c-4c9b-bea3-01508818e448-dns-svc\") pod \"dnsmasq-dns-7f58d6bb6f-s65t9\" (UID: \"ab16bde4-118c-4c9b-bea3-01508818e448\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.499854 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv7js\" (UniqueName: \"kubernetes.io/projected/ab16bde4-118c-4c9b-bea3-01508818e448-kube-api-access-pv7js\") pod \"dnsmasq-dns-7f58d6bb6f-s65t9\" (UID: \"ab16bde4-118c-4c9b-bea3-01508818e448\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" Feb 28 09:19:33 crc kubenswrapper[4687]: I0228 09:19:33.641549 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" Feb 28 09:19:34 crc kubenswrapper[4687]: I0228 09:19:33.999661 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53dddde-f595-46a9-9764-dce250c7f5b0","Type":"ContainerStarted","Data":"ecf601d21266a935ea7bec75c84995ef150422539395842ac02f0b3daa90ca50"} Feb 28 09:19:34 crc kubenswrapper[4687]: I0228 09:19:33.999956 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53dddde-f595-46a9-9764-dce250c7f5b0","Type":"ContainerStarted","Data":"8e6577af4a79f4ec0022387c8e2ca8ebf9f5cae621907e063cbbd9e6a1fbb1a6"} Feb 28 09:19:34 crc kubenswrapper[4687]: I0228 09:19:34.510255 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-s65t9"] Feb 28 09:19:34 crc kubenswrapper[4687]: W0228 09:19:34.510539 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab16bde4_118c_4c9b_bea3_01508818e448.slice/crio-4fd3083930c9b0f5bbd9ecf1c56f699871da7e3409bc86cc640f12532f6c8937 WatchSource:0}: Error finding container 4fd3083930c9b0f5bbd9ecf1c56f699871da7e3409bc86cc640f12532f6c8937: Status 404 returned error can't find the container with id 4fd3083930c9b0f5bbd9ecf1c56f699871da7e3409bc86cc640f12532f6c8937 Feb 28 09:19:35 crc kubenswrapper[4687]: I0228 09:19:35.008857 4687 generic.go:334] "Generic (PLEG): container finished" podID="ab16bde4-118c-4c9b-bea3-01508818e448" containerID="e7207caa0918eb9c93bd0eb13d0e4731e85aaf64b1a6a04f3fa40449b960fea6" exitCode=0 Feb 28 09:19:35 crc kubenswrapper[4687]: I0228 09:19:35.008953 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" event={"ID":"ab16bde4-118c-4c9b-bea3-01508818e448","Type":"ContainerDied","Data":"e7207caa0918eb9c93bd0eb13d0e4731e85aaf64b1a6a04f3fa40449b960fea6"} Feb 28 09:19:35 crc kubenswrapper[4687]: I0228 
09:19:35.009195 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" event={"ID":"ab16bde4-118c-4c9b-bea3-01508818e448","Type":"ContainerStarted","Data":"4fd3083930c9b0f5bbd9ecf1c56f699871da7e3409bc86cc640f12532f6c8937"} Feb 28 09:19:36 crc kubenswrapper[4687]: I0228 09:19:36.024882 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" event={"ID":"ab16bde4-118c-4c9b-bea3-01508818e448","Type":"ContainerStarted","Data":"93c2c956163756e0487d3c66365a36c33dbd0ebbbd99e26ed9a72c38465f445d"} Feb 28 09:19:36 crc kubenswrapper[4687]: I0228 09:19:36.025425 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" Feb 28 09:19:36 crc kubenswrapper[4687]: I0228 09:19:36.031921 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53dddde-f595-46a9-9764-dce250c7f5b0","Type":"ContainerStarted","Data":"4aae54b0773cd8cf703963fb196e86ec3a7f97f7cc6bdd4ad4c7e986edf0c693"} Feb 28 09:19:36 crc kubenswrapper[4687]: I0228 09:19:36.031968 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53dddde-f595-46a9-9764-dce250c7f5b0","Type":"ContainerStarted","Data":"7161a2f98fd5d0196095f393a5715a46eb22cd894c12f68dc5f889ee501fb5c6"} Feb 28 09:19:36 crc kubenswrapper[4687]: I0228 09:19:36.031980 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53dddde-f595-46a9-9764-dce250c7f5b0","Type":"ContainerStarted","Data":"43c903cb1ab1477c50f5c4a890117af6ca83980e48a784e4c2c60696fe2916e7"} Feb 28 09:19:36 crc kubenswrapper[4687]: I0228 09:19:36.031988 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53dddde-f595-46a9-9764-dce250c7f5b0","Type":"ContainerStarted","Data":"7b2211302fcec88a0943b7e509311b53e303edddab9720f09e4841b8345ba06e"} Feb 28 09:19:36 crc 
kubenswrapper[4687]: I0228 09:19:36.031997 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53dddde-f595-46a9-9764-dce250c7f5b0","Type":"ContainerStarted","Data":"96b032e3a3c9c23cb4bf5e6bcd07870e2a33f2563aede8c71945a8655b452548"} Feb 28 09:19:36 crc kubenswrapper[4687]: I0228 09:19:36.044774 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" podStartSLOduration=3.044754896 podStartE2EDuration="3.044754896s" podCreationTimestamp="2026-02-28 09:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:19:36.038700227 +0000 UTC m=+967.729269564" watchObservedRunningTime="2026-02-28 09:19:36.044754896 +0000 UTC m=+967.735324233" Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.047475 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53dddde-f595-46a9-9764-dce250c7f5b0","Type":"ContainerStarted","Data":"31abd1334259370caa214a53b03e090cb6173804679d5f56fcc8e8a78a9486d8"} Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.048634 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53dddde-f595-46a9-9764-dce250c7f5b0","Type":"ContainerStarted","Data":"e8f1e1f341f6d59080f8bb39b6f7c9f5730675b90566b347b1f182fda7446912"} Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.084326 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=23.660383289 podStartE2EDuration="32.08430902s" podCreationTimestamp="2026-02-28 09:19:05 +0000 UTC" firstStartedPulling="2026-02-28 09:19:26.803564738 +0000 UTC m=+958.494134075" lastFinishedPulling="2026-02-28 09:19:35.227490469 +0000 UTC m=+966.918059806" observedRunningTime="2026-02-28 09:19:37.076453104 +0000 UTC m=+968.767022441" 
watchObservedRunningTime="2026-02-28 09:19:37.08430902 +0000 UTC m=+968.774878357" Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.325241 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-s65t9"] Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.381806 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-2zs2p"] Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.383138 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.385321 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.409464 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-2zs2p"] Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.542853 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-ovsdbserver-sb\") pod \"dnsmasq-dns-75c886f8b5-2zs2p\" (UID: \"878defc9-19d4-48ce-92c3-9b0976de28d2\") " pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.543090 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-dns-swift-storage-0\") pod \"dnsmasq-dns-75c886f8b5-2zs2p\" (UID: \"878defc9-19d4-48ce-92c3-9b0976de28d2\") " pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.543174 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmwrs\" (UniqueName: 
\"kubernetes.io/projected/878defc9-19d4-48ce-92c3-9b0976de28d2-kube-api-access-gmwrs\") pod \"dnsmasq-dns-75c886f8b5-2zs2p\" (UID: \"878defc9-19d4-48ce-92c3-9b0976de28d2\") " pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.543259 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-ovsdbserver-nb\") pod \"dnsmasq-dns-75c886f8b5-2zs2p\" (UID: \"878defc9-19d4-48ce-92c3-9b0976de28d2\") " pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.543334 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-config\") pod \"dnsmasq-dns-75c886f8b5-2zs2p\" (UID: \"878defc9-19d4-48ce-92c3-9b0976de28d2\") " pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.543451 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-dns-svc\") pod \"dnsmasq-dns-75c886f8b5-2zs2p\" (UID: \"878defc9-19d4-48ce-92c3-9b0976de28d2\") " pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.645063 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-ovsdbserver-sb\") pod \"dnsmasq-dns-75c886f8b5-2zs2p\" (UID: \"878defc9-19d4-48ce-92c3-9b0976de28d2\") " pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.645139 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-dns-swift-storage-0\") pod \"dnsmasq-dns-75c886f8b5-2zs2p\" (UID: \"878defc9-19d4-48ce-92c3-9b0976de28d2\") " pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.645159 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmwrs\" (UniqueName: \"kubernetes.io/projected/878defc9-19d4-48ce-92c3-9b0976de28d2-kube-api-access-gmwrs\") pod \"dnsmasq-dns-75c886f8b5-2zs2p\" (UID: \"878defc9-19d4-48ce-92c3-9b0976de28d2\") " pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.645180 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-ovsdbserver-nb\") pod \"dnsmasq-dns-75c886f8b5-2zs2p\" (UID: \"878defc9-19d4-48ce-92c3-9b0976de28d2\") " pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.645206 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-config\") pod \"dnsmasq-dns-75c886f8b5-2zs2p\" (UID: \"878defc9-19d4-48ce-92c3-9b0976de28d2\") " pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.645272 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-dns-svc\") pod \"dnsmasq-dns-75c886f8b5-2zs2p\" (UID: \"878defc9-19d4-48ce-92c3-9b0976de28d2\") " pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.646154 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-config\") pod 
\"dnsmasq-dns-75c886f8b5-2zs2p\" (UID: \"878defc9-19d4-48ce-92c3-9b0976de28d2\") " pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.646204 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-dns-swift-storage-0\") pod \"dnsmasq-dns-75c886f8b5-2zs2p\" (UID: \"878defc9-19d4-48ce-92c3-9b0976de28d2\") " pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.646302 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-ovsdbserver-nb\") pod \"dnsmasq-dns-75c886f8b5-2zs2p\" (UID: \"878defc9-19d4-48ce-92c3-9b0976de28d2\") " pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.646569 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-dns-svc\") pod \"dnsmasq-dns-75c886f8b5-2zs2p\" (UID: \"878defc9-19d4-48ce-92c3-9b0976de28d2\") " pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.646758 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-ovsdbserver-sb\") pod \"dnsmasq-dns-75c886f8b5-2zs2p\" (UID: \"878defc9-19d4-48ce-92c3-9b0976de28d2\") " pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.661560 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmwrs\" (UniqueName: \"kubernetes.io/projected/878defc9-19d4-48ce-92c3-9b0976de28d2-kube-api-access-gmwrs\") pod \"dnsmasq-dns-75c886f8b5-2zs2p\" (UID: \"878defc9-19d4-48ce-92c3-9b0976de28d2\") " 
pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" Feb 28 09:19:37 crc kubenswrapper[4687]: I0228 09:19:37.699329 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" Feb 28 09:19:38 crc kubenswrapper[4687]: I0228 09:19:38.105257 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-2zs2p"] Feb 28 09:19:39 crc kubenswrapper[4687]: I0228 09:19:39.063174 4687 generic.go:334] "Generic (PLEG): container finished" podID="878defc9-19d4-48ce-92c3-9b0976de28d2" containerID="ba776bd8962fb3d4ada9f524b6f2f914f53f7ad479b2afc4460a951623bcb5cb" exitCode=0 Feb 28 09:19:39 crc kubenswrapper[4687]: I0228 09:19:39.063243 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" event={"ID":"878defc9-19d4-48ce-92c3-9b0976de28d2","Type":"ContainerDied","Data":"ba776bd8962fb3d4ada9f524b6f2f914f53f7ad479b2afc4460a951623bcb5cb"} Feb 28 09:19:39 crc kubenswrapper[4687]: I0228 09:19:39.063690 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" event={"ID":"878defc9-19d4-48ce-92c3-9b0976de28d2","Type":"ContainerStarted","Data":"01f6398dc076b759abccc0a91b0e944464727b20c9b9564fe5e5a3f0bb784533"} Feb 28 09:19:39 crc kubenswrapper[4687]: I0228 09:19:39.063853 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" podUID="ab16bde4-118c-4c9b-bea3-01508818e448" containerName="dnsmasq-dns" containerID="cri-o://93c2c956163756e0487d3c66365a36c33dbd0ebbbd99e26ed9a72c38465f445d" gracePeriod=10 Feb 28 09:19:39 crc kubenswrapper[4687]: I0228 09:19:39.409831 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" Feb 28 09:19:39 crc kubenswrapper[4687]: I0228 09:19:39.478589 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab16bde4-118c-4c9b-bea3-01508818e448-ovsdbserver-nb\") pod \"ab16bde4-118c-4c9b-bea3-01508818e448\" (UID: \"ab16bde4-118c-4c9b-bea3-01508818e448\") " Feb 28 09:19:39 crc kubenswrapper[4687]: I0228 09:19:39.478906 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pv7js\" (UniqueName: \"kubernetes.io/projected/ab16bde4-118c-4c9b-bea3-01508818e448-kube-api-access-pv7js\") pod \"ab16bde4-118c-4c9b-bea3-01508818e448\" (UID: \"ab16bde4-118c-4c9b-bea3-01508818e448\") " Feb 28 09:19:39 crc kubenswrapper[4687]: I0228 09:19:39.478964 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab16bde4-118c-4c9b-bea3-01508818e448-dns-svc\") pod \"ab16bde4-118c-4c9b-bea3-01508818e448\" (UID: \"ab16bde4-118c-4c9b-bea3-01508818e448\") " Feb 28 09:19:39 crc kubenswrapper[4687]: I0228 09:19:39.478998 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab16bde4-118c-4c9b-bea3-01508818e448-config\") pod \"ab16bde4-118c-4c9b-bea3-01508818e448\" (UID: \"ab16bde4-118c-4c9b-bea3-01508818e448\") " Feb 28 09:19:39 crc kubenswrapper[4687]: I0228 09:19:39.479037 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab16bde4-118c-4c9b-bea3-01508818e448-ovsdbserver-sb\") pod \"ab16bde4-118c-4c9b-bea3-01508818e448\" (UID: \"ab16bde4-118c-4c9b-bea3-01508818e448\") " Feb 28 09:19:39 crc kubenswrapper[4687]: I0228 09:19:39.485137 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ab16bde4-118c-4c9b-bea3-01508818e448-kube-api-access-pv7js" (OuterVolumeSpecName: "kube-api-access-pv7js") pod "ab16bde4-118c-4c9b-bea3-01508818e448" (UID: "ab16bde4-118c-4c9b-bea3-01508818e448"). InnerVolumeSpecName "kube-api-access-pv7js". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:19:39 crc kubenswrapper[4687]: I0228 09:19:39.515725 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab16bde4-118c-4c9b-bea3-01508818e448-config" (OuterVolumeSpecName: "config") pod "ab16bde4-118c-4c9b-bea3-01508818e448" (UID: "ab16bde4-118c-4c9b-bea3-01508818e448"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:39 crc kubenswrapper[4687]: I0228 09:19:39.515743 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab16bde4-118c-4c9b-bea3-01508818e448-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ab16bde4-118c-4c9b-bea3-01508818e448" (UID: "ab16bde4-118c-4c9b-bea3-01508818e448"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:39 crc kubenswrapper[4687]: I0228 09:19:39.516373 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab16bde4-118c-4c9b-bea3-01508818e448-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ab16bde4-118c-4c9b-bea3-01508818e448" (UID: "ab16bde4-118c-4c9b-bea3-01508818e448"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:39 crc kubenswrapper[4687]: I0228 09:19:39.519984 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab16bde4-118c-4c9b-bea3-01508818e448-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ab16bde4-118c-4c9b-bea3-01508818e448" (UID: "ab16bde4-118c-4c9b-bea3-01508818e448"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:39 crc kubenswrapper[4687]: I0228 09:19:39.580638 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pv7js\" (UniqueName: \"kubernetes.io/projected/ab16bde4-118c-4c9b-bea3-01508818e448-kube-api-access-pv7js\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:39 crc kubenswrapper[4687]: I0228 09:19:39.580663 4687 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab16bde4-118c-4c9b-bea3-01508818e448-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:39 crc kubenswrapper[4687]: I0228 09:19:39.580674 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab16bde4-118c-4c9b-bea3-01508818e448-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:39 crc kubenswrapper[4687]: I0228 09:19:39.580684 4687 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab16bde4-118c-4c9b-bea3-01508818e448-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:39 crc kubenswrapper[4687]: I0228 09:19:39.580693 4687 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab16bde4-118c-4c9b-bea3-01508818e448-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:40 crc kubenswrapper[4687]: I0228 09:19:40.075817 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" event={"ID":"878defc9-19d4-48ce-92c3-9b0976de28d2","Type":"ContainerStarted","Data":"c71ddd519cc0345f9d2e74444dbba50f32616af11432deeaaec79043832ee2de"} Feb 28 09:19:40 crc kubenswrapper[4687]: I0228 09:19:40.076199 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" Feb 28 09:19:40 crc kubenswrapper[4687]: I0228 09:19:40.078683 4687 generic.go:334] "Generic (PLEG): container finished" 
podID="ab16bde4-118c-4c9b-bea3-01508818e448" containerID="93c2c956163756e0487d3c66365a36c33dbd0ebbbd99e26ed9a72c38465f445d" exitCode=0 Feb 28 09:19:40 crc kubenswrapper[4687]: I0228 09:19:40.078786 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" event={"ID":"ab16bde4-118c-4c9b-bea3-01508818e448","Type":"ContainerDied","Data":"93c2c956163756e0487d3c66365a36c33dbd0ebbbd99e26ed9a72c38465f445d"} Feb 28 09:19:40 crc kubenswrapper[4687]: I0228 09:19:40.078843 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" event={"ID":"ab16bde4-118c-4c9b-bea3-01508818e448","Type":"ContainerDied","Data":"4fd3083930c9b0f5bbd9ecf1c56f699871da7e3409bc86cc640f12532f6c8937"} Feb 28 09:19:40 crc kubenswrapper[4687]: I0228 09:19:40.078871 4687 scope.go:117] "RemoveContainer" containerID="93c2c956163756e0487d3c66365a36c33dbd0ebbbd99e26ed9a72c38465f445d" Feb 28 09:19:40 crc kubenswrapper[4687]: I0228 09:19:40.078745 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f58d6bb6f-s65t9" Feb 28 09:19:40 crc kubenswrapper[4687]: I0228 09:19:40.100694 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" podStartSLOduration=3.100680185 podStartE2EDuration="3.100680185s" podCreationTimestamp="2026-02-28 09:19:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:19:40.092932833 +0000 UTC m=+971.783502180" watchObservedRunningTime="2026-02-28 09:19:40.100680185 +0000 UTC m=+971.791249522" Feb 28 09:19:40 crc kubenswrapper[4687]: I0228 09:19:40.104323 4687 scope.go:117] "RemoveContainer" containerID="e7207caa0918eb9c93bd0eb13d0e4731e85aaf64b1a6a04f3fa40449b960fea6" Feb 28 09:19:40 crc kubenswrapper[4687]: I0228 09:19:40.120291 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-s65t9"] Feb 28 09:19:40 crc kubenswrapper[4687]: I0228 09:19:40.129013 4687 scope.go:117] "RemoveContainer" containerID="93c2c956163756e0487d3c66365a36c33dbd0ebbbd99e26ed9a72c38465f445d" Feb 28 09:19:40 crc kubenswrapper[4687]: E0228 09:19:40.130651 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93c2c956163756e0487d3c66365a36c33dbd0ebbbd99e26ed9a72c38465f445d\": container with ID starting with 93c2c956163756e0487d3c66365a36c33dbd0ebbbd99e26ed9a72c38465f445d not found: ID does not exist" containerID="93c2c956163756e0487d3c66365a36c33dbd0ebbbd99e26ed9a72c38465f445d" Feb 28 09:19:40 crc kubenswrapper[4687]: I0228 09:19:40.130685 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93c2c956163756e0487d3c66365a36c33dbd0ebbbd99e26ed9a72c38465f445d"} err="failed to get container status \"93c2c956163756e0487d3c66365a36c33dbd0ebbbd99e26ed9a72c38465f445d\": rpc error: code = NotFound desc = 
could not find container \"93c2c956163756e0487d3c66365a36c33dbd0ebbbd99e26ed9a72c38465f445d\": container with ID starting with 93c2c956163756e0487d3c66365a36c33dbd0ebbbd99e26ed9a72c38465f445d not found: ID does not exist" Feb 28 09:19:40 crc kubenswrapper[4687]: I0228 09:19:40.130722 4687 scope.go:117] "RemoveContainer" containerID="e7207caa0918eb9c93bd0eb13d0e4731e85aaf64b1a6a04f3fa40449b960fea6" Feb 28 09:19:40 crc kubenswrapper[4687]: I0228 09:19:40.131176 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-s65t9"] Feb 28 09:19:40 crc kubenswrapper[4687]: E0228 09:19:40.131381 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7207caa0918eb9c93bd0eb13d0e4731e85aaf64b1a6a04f3fa40449b960fea6\": container with ID starting with e7207caa0918eb9c93bd0eb13d0e4731e85aaf64b1a6a04f3fa40449b960fea6 not found: ID does not exist" containerID="e7207caa0918eb9c93bd0eb13d0e4731e85aaf64b1a6a04f3fa40449b960fea6" Feb 28 09:19:40 crc kubenswrapper[4687]: I0228 09:19:40.131405 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7207caa0918eb9c93bd0eb13d0e4731e85aaf64b1a6a04f3fa40449b960fea6"} err="failed to get container status \"e7207caa0918eb9c93bd0eb13d0e4731e85aaf64b1a6a04f3fa40449b960fea6\": rpc error: code = NotFound desc = could not find container \"e7207caa0918eb9c93bd0eb13d0e4731e85aaf64b1a6a04f3fa40449b960fea6\": container with ID starting with e7207caa0918eb9c93bd0eb13d0e4731e85aaf64b1a6a04f3fa40449b960fea6 not found: ID does not exist" Feb 28 09:19:40 crc kubenswrapper[4687]: I0228 09:19:40.301259 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:19:40 crc kubenswrapper[4687]: I0228 09:19:40.590226 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 28 09:19:40 crc 
kubenswrapper[4687]: I0228 09:19:40.665842 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab16bde4-118c-4c9b-bea3-01508818e448" path="/var/lib/kubelet/pods/ab16bde4-118c-4c9b-bea3-01508818e448/volumes" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.007530 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-bpfck"] Feb 28 09:19:42 crc kubenswrapper[4687]: E0228 09:19:42.007974 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab16bde4-118c-4c9b-bea3-01508818e448" containerName="dnsmasq-dns" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.007990 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab16bde4-118c-4c9b-bea3-01508818e448" containerName="dnsmasq-dns" Feb 28 09:19:42 crc kubenswrapper[4687]: E0228 09:19:42.007999 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab16bde4-118c-4c9b-bea3-01508818e448" containerName="init" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.008005 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab16bde4-118c-4c9b-bea3-01508818e448" containerName="init" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.008216 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab16bde4-118c-4c9b-bea3-01508818e448" containerName="dnsmasq-dns" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.008769 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-bpfck" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.013692 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-bpfck"] Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.046054 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1214eb91-e4cb-4337-ab5c-e27c0dd55151-operator-scripts\") pod \"cinder-db-create-bpfck\" (UID: \"1214eb91-e4cb-4337-ab5c-e27c0dd55151\") " pod="openstack/cinder-db-create-bpfck" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.046133 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvbng\" (UniqueName: \"kubernetes.io/projected/1214eb91-e4cb-4337-ab5c-e27c0dd55151-kube-api-access-cvbng\") pod \"cinder-db-create-bpfck\" (UID: \"1214eb91-e4cb-4337-ab5c-e27c0dd55151\") " pod="openstack/cinder-db-create-bpfck" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.117241 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-9dac-account-create-update-mzccc"] Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.118666 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9dac-account-create-update-mzccc" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.120653 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.130592 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9dac-account-create-update-mzccc"] Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.147252 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1214eb91-e4cb-4337-ab5c-e27c0dd55151-operator-scripts\") pod \"cinder-db-create-bpfck\" (UID: \"1214eb91-e4cb-4337-ab5c-e27c0dd55151\") " pod="openstack/cinder-db-create-bpfck" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.147368 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmb4j\" (UniqueName: \"kubernetes.io/projected/ae33b9ae-c76a-41e3-9497-f6cbe4f4b740-kube-api-access-pmb4j\") pod \"cinder-9dac-account-create-update-mzccc\" (UID: \"ae33b9ae-c76a-41e3-9497-f6cbe4f4b740\") " pod="openstack/cinder-9dac-account-create-update-mzccc" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.147481 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae33b9ae-c76a-41e3-9497-f6cbe4f4b740-operator-scripts\") pod \"cinder-9dac-account-create-update-mzccc\" (UID: \"ae33b9ae-c76a-41e3-9497-f6cbe4f4b740\") " pod="openstack/cinder-9dac-account-create-update-mzccc" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.147551 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvbng\" (UniqueName: \"kubernetes.io/projected/1214eb91-e4cb-4337-ab5c-e27c0dd55151-kube-api-access-cvbng\") pod \"cinder-db-create-bpfck\" (UID: 
\"1214eb91-e4cb-4337-ab5c-e27c0dd55151\") " pod="openstack/cinder-db-create-bpfck" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.148134 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1214eb91-e4cb-4337-ab5c-e27c0dd55151-operator-scripts\") pod \"cinder-db-create-bpfck\" (UID: \"1214eb91-e4cb-4337-ab5c-e27c0dd55151\") " pod="openstack/cinder-db-create-bpfck" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.163759 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvbng\" (UniqueName: \"kubernetes.io/projected/1214eb91-e4cb-4337-ab5c-e27c0dd55151-kube-api-access-cvbng\") pod \"cinder-db-create-bpfck\" (UID: \"1214eb91-e4cb-4337-ab5c-e27c0dd55151\") " pod="openstack/cinder-db-create-bpfck" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.210964 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-5l5b9"] Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.212298 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-5l5b9" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.222116 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-5l5b9"] Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.226698 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-4a52-account-create-update-4zmtm"] Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.227701 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-4a52-account-create-update-4zmtm" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.229452 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.246787 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4a52-account-create-update-4zmtm"] Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.248440 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmb4j\" (UniqueName: \"kubernetes.io/projected/ae33b9ae-c76a-41e3-9497-f6cbe4f4b740-kube-api-access-pmb4j\") pod \"cinder-9dac-account-create-update-mzccc\" (UID: \"ae33b9ae-c76a-41e3-9497-f6cbe4f4b740\") " pod="openstack/cinder-9dac-account-create-update-mzccc" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.248487 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae33b9ae-c76a-41e3-9497-f6cbe4f4b740-operator-scripts\") pod \"cinder-9dac-account-create-update-mzccc\" (UID: \"ae33b9ae-c76a-41e3-9497-f6cbe4f4b740\") " pod="openstack/cinder-9dac-account-create-update-mzccc" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.248530 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpz92\" (UniqueName: \"kubernetes.io/projected/47d4394e-e0a9-4ea7-b670-fd088aa62341-kube-api-access-tpz92\") pod \"barbican-4a52-account-create-update-4zmtm\" (UID: \"47d4394e-e0a9-4ea7-b670-fd088aa62341\") " pod="openstack/barbican-4a52-account-create-update-4zmtm" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.248567 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcksh\" (UniqueName: 
\"kubernetes.io/projected/c99cb41b-642b-4dab-bd03-a8f61456a0c5-kube-api-access-vcksh\") pod \"barbican-db-create-5l5b9\" (UID: \"c99cb41b-642b-4dab-bd03-a8f61456a0c5\") " pod="openstack/barbican-db-create-5l5b9" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.248599 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c99cb41b-642b-4dab-bd03-a8f61456a0c5-operator-scripts\") pod \"barbican-db-create-5l5b9\" (UID: \"c99cb41b-642b-4dab-bd03-a8f61456a0c5\") " pod="openstack/barbican-db-create-5l5b9" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.248637 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47d4394e-e0a9-4ea7-b670-fd088aa62341-operator-scripts\") pod \"barbican-4a52-account-create-update-4zmtm\" (UID: \"47d4394e-e0a9-4ea7-b670-fd088aa62341\") " pod="openstack/barbican-4a52-account-create-update-4zmtm" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.249407 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae33b9ae-c76a-41e3-9497-f6cbe4f4b740-operator-scripts\") pod \"cinder-9dac-account-create-update-mzccc\" (UID: \"ae33b9ae-c76a-41e3-9497-f6cbe4f4b740\") " pod="openstack/cinder-9dac-account-create-update-mzccc" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.277415 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmb4j\" (UniqueName: \"kubernetes.io/projected/ae33b9ae-c76a-41e3-9497-f6cbe4f4b740-kube-api-access-pmb4j\") pod \"cinder-9dac-account-create-update-mzccc\" (UID: \"ae33b9ae-c76a-41e3-9497-f6cbe4f4b740\") " pod="openstack/cinder-9dac-account-create-update-mzccc" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.313058 4687 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/neutron-db-create-7nnfk"] Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.314262 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7nnfk" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.321114 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7nnfk"] Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.325868 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-bpfck" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.350227 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpz92\" (UniqueName: \"kubernetes.io/projected/47d4394e-e0a9-4ea7-b670-fd088aa62341-kube-api-access-tpz92\") pod \"barbican-4a52-account-create-update-4zmtm\" (UID: \"47d4394e-e0a9-4ea7-b670-fd088aa62341\") " pod="openstack/barbican-4a52-account-create-update-4zmtm" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.350298 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcksh\" (UniqueName: \"kubernetes.io/projected/c99cb41b-642b-4dab-bd03-a8f61456a0c5-kube-api-access-vcksh\") pod \"barbican-db-create-5l5b9\" (UID: \"c99cb41b-642b-4dab-bd03-a8f61456a0c5\") " pod="openstack/barbican-db-create-5l5b9" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.350351 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14080cd3-b175-4324-aacc-c3c47ead6896-operator-scripts\") pod \"neutron-db-create-7nnfk\" (UID: \"14080cd3-b175-4324-aacc-c3c47ead6896\") " pod="openstack/neutron-db-create-7nnfk" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.350384 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4xs6\" (UniqueName: 
\"kubernetes.io/projected/14080cd3-b175-4324-aacc-c3c47ead6896-kube-api-access-z4xs6\") pod \"neutron-db-create-7nnfk\" (UID: \"14080cd3-b175-4324-aacc-c3c47ead6896\") " pod="openstack/neutron-db-create-7nnfk" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.350405 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c99cb41b-642b-4dab-bd03-a8f61456a0c5-operator-scripts\") pod \"barbican-db-create-5l5b9\" (UID: \"c99cb41b-642b-4dab-bd03-a8f61456a0c5\") " pod="openstack/barbican-db-create-5l5b9" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.350463 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47d4394e-e0a9-4ea7-b670-fd088aa62341-operator-scripts\") pod \"barbican-4a52-account-create-update-4zmtm\" (UID: \"47d4394e-e0a9-4ea7-b670-fd088aa62341\") " pod="openstack/barbican-4a52-account-create-update-4zmtm" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.351243 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47d4394e-e0a9-4ea7-b670-fd088aa62341-operator-scripts\") pod \"barbican-4a52-account-create-update-4zmtm\" (UID: \"47d4394e-e0a9-4ea7-b670-fd088aa62341\") " pod="openstack/barbican-4a52-account-create-update-4zmtm" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.351483 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c99cb41b-642b-4dab-bd03-a8f61456a0c5-operator-scripts\") pod \"barbican-db-create-5l5b9\" (UID: \"c99cb41b-642b-4dab-bd03-a8f61456a0c5\") " pod="openstack/barbican-db-create-5l5b9" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.368318 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpz92\" (UniqueName: 
\"kubernetes.io/projected/47d4394e-e0a9-4ea7-b670-fd088aa62341-kube-api-access-tpz92\") pod \"barbican-4a52-account-create-update-4zmtm\" (UID: \"47d4394e-e0a9-4ea7-b670-fd088aa62341\") " pod="openstack/barbican-4a52-account-create-update-4zmtm" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.370938 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcksh\" (UniqueName: \"kubernetes.io/projected/c99cb41b-642b-4dab-bd03-a8f61456a0c5-kube-api-access-vcksh\") pod \"barbican-db-create-5l5b9\" (UID: \"c99cb41b-642b-4dab-bd03-a8f61456a0c5\") " pod="openstack/barbican-db-create-5l5b9" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.380849 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-8jhf4"] Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.382444 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8jhf4" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.390250 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.390334 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-29qml" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.390417 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.390506 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.400233 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-8jhf4"] Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.426753 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-73ed-account-create-update-99ctp"] Feb 28 09:19:42 crc kubenswrapper[4687]: 
I0228 09:19:42.427892 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-73ed-account-create-update-99ctp" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.431989 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.432134 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9dac-account-create-update-mzccc" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.439280 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-73ed-account-create-update-99ctp"] Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.454091 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14080cd3-b175-4324-aacc-c3c47ead6896-operator-scripts\") pod \"neutron-db-create-7nnfk\" (UID: \"14080cd3-b175-4324-aacc-c3c47ead6896\") " pod="openstack/neutron-db-create-7nnfk" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.454316 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4xs6\" (UniqueName: \"kubernetes.io/projected/14080cd3-b175-4324-aacc-c3c47ead6896-kube-api-access-z4xs6\") pod \"neutron-db-create-7nnfk\" (UID: \"14080cd3-b175-4324-aacc-c3c47ead6896\") " pod="openstack/neutron-db-create-7nnfk" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.454946 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14080cd3-b175-4324-aacc-c3c47ead6896-operator-scripts\") pod \"neutron-db-create-7nnfk\" (UID: \"14080cd3-b175-4324-aacc-c3c47ead6896\") " pod="openstack/neutron-db-create-7nnfk" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.468349 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-z4xs6\" (UniqueName: \"kubernetes.io/projected/14080cd3-b175-4324-aacc-c3c47ead6896-kube-api-access-z4xs6\") pod \"neutron-db-create-7nnfk\" (UID: \"14080cd3-b175-4324-aacc-c3c47ead6896\") " pod="openstack/neutron-db-create-7nnfk" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.526739 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-5l5b9" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.539112 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4a52-account-create-update-4zmtm" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.556487 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lr9c\" (UniqueName: \"kubernetes.io/projected/15ea0f78-cfa4-4a12-8e4e-92bc30488ad1-kube-api-access-7lr9c\") pod \"neutron-73ed-account-create-update-99ctp\" (UID: \"15ea0f78-cfa4-4a12-8e4e-92bc30488ad1\") " pod="openstack/neutron-73ed-account-create-update-99ctp" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.556554 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4856ec29-c1c6-4c66-b64d-0daf938e4104-config-data\") pod \"keystone-db-sync-8jhf4\" (UID: \"4856ec29-c1c6-4c66-b64d-0daf938e4104\") " pod="openstack/keystone-db-sync-8jhf4" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.556591 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n7j5\" (UniqueName: \"kubernetes.io/projected/4856ec29-c1c6-4c66-b64d-0daf938e4104-kube-api-access-4n7j5\") pod \"keystone-db-sync-8jhf4\" (UID: \"4856ec29-c1c6-4c66-b64d-0daf938e4104\") " pod="openstack/keystone-db-sync-8jhf4" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.557939 4687 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15ea0f78-cfa4-4a12-8e4e-92bc30488ad1-operator-scripts\") pod \"neutron-73ed-account-create-update-99ctp\" (UID: \"15ea0f78-cfa4-4a12-8e4e-92bc30488ad1\") " pod="openstack/neutron-73ed-account-create-update-99ctp" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.558058 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4856ec29-c1c6-4c66-b64d-0daf938e4104-combined-ca-bundle\") pod \"keystone-db-sync-8jhf4\" (UID: \"4856ec29-c1c6-4c66-b64d-0daf938e4104\") " pod="openstack/keystone-db-sync-8jhf4" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.633582 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7nnfk" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.659863 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lr9c\" (UniqueName: \"kubernetes.io/projected/15ea0f78-cfa4-4a12-8e4e-92bc30488ad1-kube-api-access-7lr9c\") pod \"neutron-73ed-account-create-update-99ctp\" (UID: \"15ea0f78-cfa4-4a12-8e4e-92bc30488ad1\") " pod="openstack/neutron-73ed-account-create-update-99ctp" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.659926 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4856ec29-c1c6-4c66-b64d-0daf938e4104-config-data\") pod \"keystone-db-sync-8jhf4\" (UID: \"4856ec29-c1c6-4c66-b64d-0daf938e4104\") " pod="openstack/keystone-db-sync-8jhf4" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.659965 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n7j5\" (UniqueName: \"kubernetes.io/projected/4856ec29-c1c6-4c66-b64d-0daf938e4104-kube-api-access-4n7j5\") pod 
\"keystone-db-sync-8jhf4\" (UID: \"4856ec29-c1c6-4c66-b64d-0daf938e4104\") " pod="openstack/keystone-db-sync-8jhf4" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.660001 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15ea0f78-cfa4-4a12-8e4e-92bc30488ad1-operator-scripts\") pod \"neutron-73ed-account-create-update-99ctp\" (UID: \"15ea0f78-cfa4-4a12-8e4e-92bc30488ad1\") " pod="openstack/neutron-73ed-account-create-update-99ctp" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.660045 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4856ec29-c1c6-4c66-b64d-0daf938e4104-combined-ca-bundle\") pod \"keystone-db-sync-8jhf4\" (UID: \"4856ec29-c1c6-4c66-b64d-0daf938e4104\") " pod="openstack/keystone-db-sync-8jhf4" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.661616 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15ea0f78-cfa4-4a12-8e4e-92bc30488ad1-operator-scripts\") pod \"neutron-73ed-account-create-update-99ctp\" (UID: \"15ea0f78-cfa4-4a12-8e4e-92bc30488ad1\") " pod="openstack/neutron-73ed-account-create-update-99ctp" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.665124 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4856ec29-c1c6-4c66-b64d-0daf938e4104-combined-ca-bundle\") pod \"keystone-db-sync-8jhf4\" (UID: \"4856ec29-c1c6-4c66-b64d-0daf938e4104\") " pod="openstack/keystone-db-sync-8jhf4" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.667629 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4856ec29-c1c6-4c66-b64d-0daf938e4104-config-data\") pod \"keystone-db-sync-8jhf4\" (UID: 
\"4856ec29-c1c6-4c66-b64d-0daf938e4104\") " pod="openstack/keystone-db-sync-8jhf4" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.677748 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lr9c\" (UniqueName: \"kubernetes.io/projected/15ea0f78-cfa4-4a12-8e4e-92bc30488ad1-kube-api-access-7lr9c\") pod \"neutron-73ed-account-create-update-99ctp\" (UID: \"15ea0f78-cfa4-4a12-8e4e-92bc30488ad1\") " pod="openstack/neutron-73ed-account-create-update-99ctp" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.679918 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n7j5\" (UniqueName: \"kubernetes.io/projected/4856ec29-c1c6-4c66-b64d-0daf938e4104-kube-api-access-4n7j5\") pod \"keystone-db-sync-8jhf4\" (UID: \"4856ec29-c1c6-4c66-b64d-0daf938e4104\") " pod="openstack/keystone-db-sync-8jhf4" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.714751 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8jhf4" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.738956 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-73ed-account-create-update-99ctp" Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.857817 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-bpfck"] Feb 28 09:19:42 crc kubenswrapper[4687]: I0228 09:19:42.920053 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9dac-account-create-update-mzccc"] Feb 28 09:19:42 crc kubenswrapper[4687]: W0228 09:19:42.927115 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae33b9ae_c76a_41e3_9497_f6cbe4f4b740.slice/crio-cbe397518795a181fcf18c4f1d45a6372f0879d76bc11d68668b5dcaeb23bba6 WatchSource:0}: Error finding container cbe397518795a181fcf18c4f1d45a6372f0879d76bc11d68668b5dcaeb23bba6: Status 404 returned error can't find the container with id cbe397518795a181fcf18c4f1d45a6372f0879d76bc11d68668b5dcaeb23bba6 Feb 28 09:19:43 crc kubenswrapper[4687]: I0228 09:19:43.000585 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-5l5b9"] Feb 28 09:19:43 crc kubenswrapper[4687]: W0228 09:19:43.012302 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc99cb41b_642b_4dab_bd03_a8f61456a0c5.slice/crio-d249f08a6a147ddee3dc8bf29989bf76e914ec1bdda41cecc82608daf24e249e WatchSource:0}: Error finding container d249f08a6a147ddee3dc8bf29989bf76e914ec1bdda41cecc82608daf24e249e: Status 404 returned error can't find the container with id d249f08a6a147ddee3dc8bf29989bf76e914ec1bdda41cecc82608daf24e249e Feb 28 09:19:43 crc kubenswrapper[4687]: I0228 09:19:43.033914 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4a52-account-create-update-4zmtm"] Feb 28 09:19:43 crc kubenswrapper[4687]: I0228 09:19:43.113607 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-9dac-account-create-update-mzccc" event={"ID":"ae33b9ae-c76a-41e3-9497-f6cbe4f4b740","Type":"ContainerStarted","Data":"8d1ee149385bb5d905ef6542e2279421190aa48afdf2729a635bca659f0a9f22"} Feb 28 09:19:43 crc kubenswrapper[4687]: I0228 09:19:43.113658 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9dac-account-create-update-mzccc" event={"ID":"ae33b9ae-c76a-41e3-9497-f6cbe4f4b740","Type":"ContainerStarted","Data":"cbe397518795a181fcf18c4f1d45a6372f0879d76bc11d68668b5dcaeb23bba6"} Feb 28 09:19:43 crc kubenswrapper[4687]: I0228 09:19:43.117835 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7nnfk"] Feb 28 09:19:43 crc kubenswrapper[4687]: I0228 09:19:43.129962 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bpfck" event={"ID":"1214eb91-e4cb-4337-ab5c-e27c0dd55151","Type":"ContainerStarted","Data":"a793e5c63138041812e503b2870f415a857f4e1067bc1332320a92a2b68438a2"} Feb 28 09:19:43 crc kubenswrapper[4687]: I0228 09:19:43.130012 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bpfck" event={"ID":"1214eb91-e4cb-4337-ab5c-e27c0dd55151","Type":"ContainerStarted","Data":"239d48b7e67f688a6ab613e0ef1ea2850779c97a9e71aa3b115a6b7510f5c3b3"} Feb 28 09:19:43 crc kubenswrapper[4687]: I0228 09:19:43.130958 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-9dac-account-create-update-mzccc" podStartSLOduration=1.130943861 podStartE2EDuration="1.130943861s" podCreationTimestamp="2026-02-28 09:19:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:19:43.126139425 +0000 UTC m=+974.816708762" watchObservedRunningTime="2026-02-28 09:19:43.130943861 +0000 UTC m=+974.821513199" Feb 28 09:19:43 crc kubenswrapper[4687]: I0228 09:19:43.138433 4687 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/barbican-4a52-account-create-update-4zmtm" event={"ID":"47d4394e-e0a9-4ea7-b670-fd088aa62341","Type":"ContainerStarted","Data":"df7c14c4005e7643cf9aa4dd637db127c3b272ba49870073beae400ba3b67e19"} Feb 28 09:19:43 crc kubenswrapper[4687]: I0228 09:19:43.141622 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5l5b9" event={"ID":"c99cb41b-642b-4dab-bd03-a8f61456a0c5","Type":"ContainerStarted","Data":"d249f08a6a147ddee3dc8bf29989bf76e914ec1bdda41cecc82608daf24e249e"} Feb 28 09:19:43 crc kubenswrapper[4687]: I0228 09:19:43.261654 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-bpfck" podStartSLOduration=2.261630125 podStartE2EDuration="2.261630125s" podCreationTimestamp="2026-02-28 09:19:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:19:43.143664226 +0000 UTC m=+974.834233563" watchObservedRunningTime="2026-02-28 09:19:43.261630125 +0000 UTC m=+974.952199462" Feb 28 09:19:43 crc kubenswrapper[4687]: I0228 09:19:43.267548 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-73ed-account-create-update-99ctp"] Feb 28 09:19:43 crc kubenswrapper[4687]: I0228 09:19:43.279369 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-8jhf4"] Feb 28 09:19:44 crc kubenswrapper[4687]: I0228 09:19:44.157638 4687 generic.go:334] "Generic (PLEG): container finished" podID="14080cd3-b175-4324-aacc-c3c47ead6896" containerID="7a9df33dd1fc3f826946d659a6953fd6949e61530dfd83b530089fd5e576317d" exitCode=0 Feb 28 09:19:44 crc kubenswrapper[4687]: I0228 09:19:44.157742 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7nnfk" event={"ID":"14080cd3-b175-4324-aacc-c3c47ead6896","Type":"ContainerDied","Data":"7a9df33dd1fc3f826946d659a6953fd6949e61530dfd83b530089fd5e576317d"} Feb 28 
09:19:44 crc kubenswrapper[4687]: I0228 09:19:44.157813 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7nnfk" event={"ID":"14080cd3-b175-4324-aacc-c3c47ead6896","Type":"ContainerStarted","Data":"726a1b0b2de3a724a0dc26ee26c15e2cb84617397c734b145a13de59495e950d"} Feb 28 09:19:44 crc kubenswrapper[4687]: I0228 09:19:44.161300 4687 generic.go:334] "Generic (PLEG): container finished" podID="47d4394e-e0a9-4ea7-b670-fd088aa62341" containerID="cfc614ff4012eeea4ee2bb3b218f0501a82c0b2893e9d3698195e010681fb7c8" exitCode=0 Feb 28 09:19:44 crc kubenswrapper[4687]: I0228 09:19:44.161388 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4a52-account-create-update-4zmtm" event={"ID":"47d4394e-e0a9-4ea7-b670-fd088aa62341","Type":"ContainerDied","Data":"cfc614ff4012eeea4ee2bb3b218f0501a82c0b2893e9d3698195e010681fb7c8"} Feb 28 09:19:44 crc kubenswrapper[4687]: I0228 09:19:44.163905 4687 generic.go:334] "Generic (PLEG): container finished" podID="15ea0f78-cfa4-4a12-8e4e-92bc30488ad1" containerID="6797815408b770565467666b143e0cd011b644fdb43e312af592f4cf558f9d0f" exitCode=0 Feb 28 09:19:44 crc kubenswrapper[4687]: I0228 09:19:44.163977 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-73ed-account-create-update-99ctp" event={"ID":"15ea0f78-cfa4-4a12-8e4e-92bc30488ad1","Type":"ContainerDied","Data":"6797815408b770565467666b143e0cd011b644fdb43e312af592f4cf558f9d0f"} Feb 28 09:19:44 crc kubenswrapper[4687]: I0228 09:19:44.164044 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-73ed-account-create-update-99ctp" event={"ID":"15ea0f78-cfa4-4a12-8e4e-92bc30488ad1","Type":"ContainerStarted","Data":"e08e8d020cf921f60af789788cc21acaf37a448e757f135e2f02e773d5125be1"} Feb 28 09:19:44 crc kubenswrapper[4687]: I0228 09:19:44.165121 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8jhf4" 
event={"ID":"4856ec29-c1c6-4c66-b64d-0daf938e4104","Type":"ContainerStarted","Data":"36c150f1a3ccc1d84b291e7f7a4fe326b5e765786a5244537dee9c3a835a2483"} Feb 28 09:19:44 crc kubenswrapper[4687]: I0228 09:19:44.166808 4687 generic.go:334] "Generic (PLEG): container finished" podID="c99cb41b-642b-4dab-bd03-a8f61456a0c5" containerID="4256a783c481eb3f81ab67e9750afeddc57187e7417b95fb7a456df1df32422b" exitCode=0 Feb 28 09:19:44 crc kubenswrapper[4687]: I0228 09:19:44.166928 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5l5b9" event={"ID":"c99cb41b-642b-4dab-bd03-a8f61456a0c5","Type":"ContainerDied","Data":"4256a783c481eb3f81ab67e9750afeddc57187e7417b95fb7a456df1df32422b"} Feb 28 09:19:44 crc kubenswrapper[4687]: I0228 09:19:44.168436 4687 generic.go:334] "Generic (PLEG): container finished" podID="ae33b9ae-c76a-41e3-9497-f6cbe4f4b740" containerID="8d1ee149385bb5d905ef6542e2279421190aa48afdf2729a635bca659f0a9f22" exitCode=0 Feb 28 09:19:44 crc kubenswrapper[4687]: I0228 09:19:44.168494 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9dac-account-create-update-mzccc" event={"ID":"ae33b9ae-c76a-41e3-9497-f6cbe4f4b740","Type":"ContainerDied","Data":"8d1ee149385bb5d905ef6542e2279421190aa48afdf2729a635bca659f0a9f22"} Feb 28 09:19:44 crc kubenswrapper[4687]: I0228 09:19:44.170707 4687 generic.go:334] "Generic (PLEG): container finished" podID="1214eb91-e4cb-4337-ab5c-e27c0dd55151" containerID="a793e5c63138041812e503b2870f415a857f4e1067bc1332320a92a2b68438a2" exitCode=0 Feb 28 09:19:44 crc kubenswrapper[4687]: I0228 09:19:44.170751 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bpfck" event={"ID":"1214eb91-e4cb-4337-ab5c-e27c0dd55151","Type":"ContainerDied","Data":"a793e5c63138041812e503b2870f415a857f4e1067bc1332320a92a2b68438a2"} Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.427321 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9dac-account-create-update-mzccc" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.433528 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4a52-account-create-update-4zmtm" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.438379 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7nnfk" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.469760 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmb4j\" (UniqueName: \"kubernetes.io/projected/ae33b9ae-c76a-41e3-9497-f6cbe4f4b740-kube-api-access-pmb4j\") pod \"ae33b9ae-c76a-41e3-9497-f6cbe4f4b740\" (UID: \"ae33b9ae-c76a-41e3-9497-f6cbe4f4b740\") " Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.469953 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpz92\" (UniqueName: \"kubernetes.io/projected/47d4394e-e0a9-4ea7-b670-fd088aa62341-kube-api-access-tpz92\") pod \"47d4394e-e0a9-4ea7-b670-fd088aa62341\" (UID: \"47d4394e-e0a9-4ea7-b670-fd088aa62341\") " Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.469999 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae33b9ae-c76a-41e3-9497-f6cbe4f4b740-operator-scripts\") pod \"ae33b9ae-c76a-41e3-9497-f6cbe4f4b740\" (UID: \"ae33b9ae-c76a-41e3-9497-f6cbe4f4b740\") " Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.470081 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4xs6\" (UniqueName: \"kubernetes.io/projected/14080cd3-b175-4324-aacc-c3c47ead6896-kube-api-access-z4xs6\") pod \"14080cd3-b175-4324-aacc-c3c47ead6896\" (UID: \"14080cd3-b175-4324-aacc-c3c47ead6896\") " Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.470150 4687 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47d4394e-e0a9-4ea7-b670-fd088aa62341-operator-scripts\") pod \"47d4394e-e0a9-4ea7-b670-fd088aa62341\" (UID: \"47d4394e-e0a9-4ea7-b670-fd088aa62341\") " Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.470186 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14080cd3-b175-4324-aacc-c3c47ead6896-operator-scripts\") pod \"14080cd3-b175-4324-aacc-c3c47ead6896\" (UID: \"14080cd3-b175-4324-aacc-c3c47ead6896\") " Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.471691 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae33b9ae-c76a-41e3-9497-f6cbe4f4b740-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ae33b9ae-c76a-41e3-9497-f6cbe4f4b740" (UID: "ae33b9ae-c76a-41e3-9497-f6cbe4f4b740"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.471629 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47d4394e-e0a9-4ea7-b670-fd088aa62341-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "47d4394e-e0a9-4ea7-b670-fd088aa62341" (UID: "47d4394e-e0a9-4ea7-b670-fd088aa62341"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.473276 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14080cd3-b175-4324-aacc-c3c47ead6896-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "14080cd3-b175-4324-aacc-c3c47ead6896" (UID: "14080cd3-b175-4324-aacc-c3c47ead6896"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.473754 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47d4394e-e0a9-4ea7-b670-fd088aa62341-kube-api-access-tpz92" (OuterVolumeSpecName: "kube-api-access-tpz92") pod "47d4394e-e0a9-4ea7-b670-fd088aa62341" (UID: "47d4394e-e0a9-4ea7-b670-fd088aa62341"). InnerVolumeSpecName "kube-api-access-tpz92". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.474373 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae33b9ae-c76a-41e3-9497-f6cbe4f4b740-kube-api-access-pmb4j" (OuterVolumeSpecName: "kube-api-access-pmb4j") pod "ae33b9ae-c76a-41e3-9497-f6cbe4f4b740" (UID: "ae33b9ae-c76a-41e3-9497-f6cbe4f4b740"). InnerVolumeSpecName "kube-api-access-pmb4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.475132 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14080cd3-b175-4324-aacc-c3c47ead6896-kube-api-access-z4xs6" (OuterVolumeSpecName: "kube-api-access-z4xs6") pod "14080cd3-b175-4324-aacc-c3c47ead6896" (UID: "14080cd3-b175-4324-aacc-c3c47ead6896"). InnerVolumeSpecName "kube-api-access-z4xs6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.481603 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-bpfck" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.524001 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-73ed-account-create-update-99ctp" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.527427 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-5l5b9" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.571854 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15ea0f78-cfa4-4a12-8e4e-92bc30488ad1-operator-scripts\") pod \"15ea0f78-cfa4-4a12-8e4e-92bc30488ad1\" (UID: \"15ea0f78-cfa4-4a12-8e4e-92bc30488ad1\") " Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.571935 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lr9c\" (UniqueName: \"kubernetes.io/projected/15ea0f78-cfa4-4a12-8e4e-92bc30488ad1-kube-api-access-7lr9c\") pod \"15ea0f78-cfa4-4a12-8e4e-92bc30488ad1\" (UID: \"15ea0f78-cfa4-4a12-8e4e-92bc30488ad1\") " Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.572013 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c99cb41b-642b-4dab-bd03-a8f61456a0c5-operator-scripts\") pod \"c99cb41b-642b-4dab-bd03-a8f61456a0c5\" (UID: \"c99cb41b-642b-4dab-bd03-a8f61456a0c5\") " Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.572060 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcksh\" (UniqueName: \"kubernetes.io/projected/c99cb41b-642b-4dab-bd03-a8f61456a0c5-kube-api-access-vcksh\") pod \"c99cb41b-642b-4dab-bd03-a8f61456a0c5\" (UID: \"c99cb41b-642b-4dab-bd03-a8f61456a0c5\") " Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.572078 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvbng\" (UniqueName: \"kubernetes.io/projected/1214eb91-e4cb-4337-ab5c-e27c0dd55151-kube-api-access-cvbng\") pod \"1214eb91-e4cb-4337-ab5c-e27c0dd55151\" (UID: \"1214eb91-e4cb-4337-ab5c-e27c0dd55151\") " Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.572175 4687 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1214eb91-e4cb-4337-ab5c-e27c0dd55151-operator-scripts\") pod \"1214eb91-e4cb-4337-ab5c-e27c0dd55151\" (UID: \"1214eb91-e4cb-4337-ab5c-e27c0dd55151\") " Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.572736 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4xs6\" (UniqueName: \"kubernetes.io/projected/14080cd3-b175-4324-aacc-c3c47ead6896-kube-api-access-z4xs6\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.572754 4687 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47d4394e-e0a9-4ea7-b670-fd088aa62341-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.572764 4687 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14080cd3-b175-4324-aacc-c3c47ead6896-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.572774 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmb4j\" (UniqueName: \"kubernetes.io/projected/ae33b9ae-c76a-41e3-9497-f6cbe4f4b740-kube-api-access-pmb4j\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.572783 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpz92\" (UniqueName: \"kubernetes.io/projected/47d4394e-e0a9-4ea7-b670-fd088aa62341-kube-api-access-tpz92\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.572793 4687 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae33b9ae-c76a-41e3-9497-f6cbe4f4b740-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.573269 4687 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1214eb91-e4cb-4337-ab5c-e27c0dd55151-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1214eb91-e4cb-4337-ab5c-e27c0dd55151" (UID: "1214eb91-e4cb-4337-ab5c-e27c0dd55151"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.573703 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15ea0f78-cfa4-4a12-8e4e-92bc30488ad1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "15ea0f78-cfa4-4a12-8e4e-92bc30488ad1" (UID: "15ea0f78-cfa4-4a12-8e4e-92bc30488ad1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.574629 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c99cb41b-642b-4dab-bd03-a8f61456a0c5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c99cb41b-642b-4dab-bd03-a8f61456a0c5" (UID: "c99cb41b-642b-4dab-bd03-a8f61456a0c5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.578006 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15ea0f78-cfa4-4a12-8e4e-92bc30488ad1-kube-api-access-7lr9c" (OuterVolumeSpecName: "kube-api-access-7lr9c") pod "15ea0f78-cfa4-4a12-8e4e-92bc30488ad1" (UID: "15ea0f78-cfa4-4a12-8e4e-92bc30488ad1"). InnerVolumeSpecName "kube-api-access-7lr9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.578538 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1214eb91-e4cb-4337-ab5c-e27c0dd55151-kube-api-access-cvbng" (OuterVolumeSpecName: "kube-api-access-cvbng") pod "1214eb91-e4cb-4337-ab5c-e27c0dd55151" (UID: "1214eb91-e4cb-4337-ab5c-e27c0dd55151"). InnerVolumeSpecName "kube-api-access-cvbng". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.578840 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c99cb41b-642b-4dab-bd03-a8f61456a0c5-kube-api-access-vcksh" (OuterVolumeSpecName: "kube-api-access-vcksh") pod "c99cb41b-642b-4dab-bd03-a8f61456a0c5" (UID: "c99cb41b-642b-4dab-bd03-a8f61456a0c5"). InnerVolumeSpecName "kube-api-access-vcksh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.674752 4687 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15ea0f78-cfa4-4a12-8e4e-92bc30488ad1-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.674795 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lr9c\" (UniqueName: \"kubernetes.io/projected/15ea0f78-cfa4-4a12-8e4e-92bc30488ad1-kube-api-access-7lr9c\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.674813 4687 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c99cb41b-642b-4dab-bd03-a8f61456a0c5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.674825 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcksh\" (UniqueName: 
\"kubernetes.io/projected/c99cb41b-642b-4dab-bd03-a8f61456a0c5-kube-api-access-vcksh\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.674838 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvbng\" (UniqueName: \"kubernetes.io/projected/1214eb91-e4cb-4337-ab5c-e27c0dd55151-kube-api-access-cvbng\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.674849 4687 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1214eb91-e4cb-4337-ab5c-e27c0dd55151-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.701000 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.753482 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-289rq"] Feb 28 09:19:47 crc kubenswrapper[4687]: I0228 09:19:47.753708 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-675f7dd995-289rq" podUID="bd32ea7d-aac0-4f3a-87fb-71e34e00889d" containerName="dnsmasq-dns" containerID="cri-o://8d21056435780720347f0d40902d5b56c2dd93117afec1d73fedefe239612a01" gracePeriod=10 Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.134598 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-289rq" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.182876 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-ovsdbserver-sb\") pod \"bd32ea7d-aac0-4f3a-87fb-71e34e00889d\" (UID: \"bd32ea7d-aac0-4f3a-87fb-71e34e00889d\") " Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.183063 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-dns-svc\") pod \"bd32ea7d-aac0-4f3a-87fb-71e34e00889d\" (UID: \"bd32ea7d-aac0-4f3a-87fb-71e34e00889d\") " Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.183171 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-ovsdbserver-nb\") pod \"bd32ea7d-aac0-4f3a-87fb-71e34e00889d\" (UID: \"bd32ea7d-aac0-4f3a-87fb-71e34e00889d\") " Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.183230 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf245\" (UniqueName: \"kubernetes.io/projected/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-kube-api-access-zf245\") pod \"bd32ea7d-aac0-4f3a-87fb-71e34e00889d\" (UID: \"bd32ea7d-aac0-4f3a-87fb-71e34e00889d\") " Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.183328 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-config\") pod \"bd32ea7d-aac0-4f3a-87fb-71e34e00889d\" (UID: \"bd32ea7d-aac0-4f3a-87fb-71e34e00889d\") " Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.190984 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-kube-api-access-zf245" (OuterVolumeSpecName: "kube-api-access-zf245") pod "bd32ea7d-aac0-4f3a-87fb-71e34e00889d" (UID: "bd32ea7d-aac0-4f3a-87fb-71e34e00889d"). InnerVolumeSpecName "kube-api-access-zf245". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.216896 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7nnfk" event={"ID":"14080cd3-b175-4324-aacc-c3c47ead6896","Type":"ContainerDied","Data":"726a1b0b2de3a724a0dc26ee26c15e2cb84617397c734b145a13de59495e950d"} Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.217077 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="726a1b0b2de3a724a0dc26ee26c15e2cb84617397c734b145a13de59495e950d" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.216911 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7nnfk" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.218990 4687 generic.go:334] "Generic (PLEG): container finished" podID="bd32ea7d-aac0-4f3a-87fb-71e34e00889d" containerID="8d21056435780720347f0d40902d5b56c2dd93117afec1d73fedefe239612a01" exitCode=0 Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.219124 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-289rq" event={"ID":"bd32ea7d-aac0-4f3a-87fb-71e34e00889d","Type":"ContainerDied","Data":"8d21056435780720347f0d40902d5b56c2dd93117afec1d73fedefe239612a01"} Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.219154 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-289rq" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.219192 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-289rq" event={"ID":"bd32ea7d-aac0-4f3a-87fb-71e34e00889d","Type":"ContainerDied","Data":"a236a7cb9a01d032592f9dcb07915aeff4c220574f3d0b4fb5f4e98a4d80258b"} Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.219220 4687 scope.go:117] "RemoveContainer" containerID="8d21056435780720347f0d40902d5b56c2dd93117afec1d73fedefe239612a01" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.220267 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-config" (OuterVolumeSpecName: "config") pod "bd32ea7d-aac0-4f3a-87fb-71e34e00889d" (UID: "bd32ea7d-aac0-4f3a-87fb-71e34e00889d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.221094 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bd32ea7d-aac0-4f3a-87fb-71e34e00889d" (UID: "bd32ea7d-aac0-4f3a-87fb-71e34e00889d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.221498 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4a52-account-create-update-4zmtm" event={"ID":"47d4394e-e0a9-4ea7-b670-fd088aa62341","Type":"ContainerDied","Data":"df7c14c4005e7643cf9aa4dd637db127c3b272ba49870073beae400ba3b67e19"} Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.221531 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df7c14c4005e7643cf9aa4dd637db127c3b272ba49870073beae400ba3b67e19" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.221591 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4a52-account-create-update-4zmtm" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.222900 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bd32ea7d-aac0-4f3a-87fb-71e34e00889d" (UID: "bd32ea7d-aac0-4f3a-87fb-71e34e00889d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.222959 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-73ed-account-create-update-99ctp" event={"ID":"15ea0f78-cfa4-4a12-8e4e-92bc30488ad1","Type":"ContainerDied","Data":"e08e8d020cf921f60af789788cc21acaf37a448e757f135e2f02e773d5125be1"} Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.222986 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-73ed-account-create-update-99ctp" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.222988 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e08e8d020cf921f60af789788cc21acaf37a448e757f135e2f02e773d5125be1" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.224453 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bd32ea7d-aac0-4f3a-87fb-71e34e00889d" (UID: "bd32ea7d-aac0-4f3a-87fb-71e34e00889d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.226827 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8jhf4" event={"ID":"4856ec29-c1c6-4c66-b64d-0daf938e4104","Type":"ContainerStarted","Data":"f4a255a39b6cee4bfde8c5ec52bb5a6138c52ca1ca5f193e4739b9beb18d718a"} Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.231983 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5l5b9" event={"ID":"c99cb41b-642b-4dab-bd03-a8f61456a0c5","Type":"ContainerDied","Data":"d249f08a6a147ddee3dc8bf29989bf76e914ec1bdda41cecc82608daf24e249e"} Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.232056 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d249f08a6a147ddee3dc8bf29989bf76e914ec1bdda41cecc82608daf24e249e" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.232115 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-5l5b9" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.234285 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9dac-account-create-update-mzccc" event={"ID":"ae33b9ae-c76a-41e3-9497-f6cbe4f4b740","Type":"ContainerDied","Data":"cbe397518795a181fcf18c4f1d45a6372f0879d76bc11d68668b5dcaeb23bba6"} Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.234318 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbe397518795a181fcf18c4f1d45a6372f0879d76bc11d68668b5dcaeb23bba6" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.234336 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9dac-account-create-update-mzccc" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.243736 4687 scope.go:117] "RemoveContainer" containerID="03c6ec8b0305c644cb9128a429441e50bebc572c8ee6fa11934aebec54ca9106" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.248381 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bpfck" event={"ID":"1214eb91-e4cb-4337-ab5c-e27c0dd55151","Type":"ContainerDied","Data":"239d48b7e67f688a6ab613e0ef1ea2850779c97a9e71aa3b115a6b7510f5c3b3"} Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.248411 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="239d48b7e67f688a6ab613e0ef1ea2850779c97a9e71aa3b115a6b7510f5c3b3" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.248487 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-bpfck" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.249053 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-8jhf4" podStartSLOduration=2.247333291 podStartE2EDuration="6.249039863s" podCreationTimestamp="2026-02-28 09:19:42 +0000 UTC" firstStartedPulling="2026-02-28 09:19:43.300815768 +0000 UTC m=+974.991385105" lastFinishedPulling="2026-02-28 09:19:47.302522339 +0000 UTC m=+978.993091677" observedRunningTime="2026-02-28 09:19:48.23686896 +0000 UTC m=+979.927438298" watchObservedRunningTime="2026-02-28 09:19:48.249039863 +0000 UTC m=+979.939609200" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.264517 4687 scope.go:117] "RemoveContainer" containerID="8d21056435780720347f0d40902d5b56c2dd93117afec1d73fedefe239612a01" Feb 28 09:19:48 crc kubenswrapper[4687]: E0228 09:19:48.264916 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d21056435780720347f0d40902d5b56c2dd93117afec1d73fedefe239612a01\": container with ID starting with 8d21056435780720347f0d40902d5b56c2dd93117afec1d73fedefe239612a01 not found: ID does not exist" containerID="8d21056435780720347f0d40902d5b56c2dd93117afec1d73fedefe239612a01" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.267119 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d21056435780720347f0d40902d5b56c2dd93117afec1d73fedefe239612a01"} err="failed to get container status \"8d21056435780720347f0d40902d5b56c2dd93117afec1d73fedefe239612a01\": rpc error: code = NotFound desc = could not find container \"8d21056435780720347f0d40902d5b56c2dd93117afec1d73fedefe239612a01\": container with ID starting with 8d21056435780720347f0d40902d5b56c2dd93117afec1d73fedefe239612a01 not found: ID does not exist" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.267173 4687 scope.go:117] 
"RemoveContainer" containerID="03c6ec8b0305c644cb9128a429441e50bebc572c8ee6fa11934aebec54ca9106" Feb 28 09:19:48 crc kubenswrapper[4687]: E0228 09:19:48.267531 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03c6ec8b0305c644cb9128a429441e50bebc572c8ee6fa11934aebec54ca9106\": container with ID starting with 03c6ec8b0305c644cb9128a429441e50bebc572c8ee6fa11934aebec54ca9106 not found: ID does not exist" containerID="03c6ec8b0305c644cb9128a429441e50bebc572c8ee6fa11934aebec54ca9106" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.267606 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03c6ec8b0305c644cb9128a429441e50bebc572c8ee6fa11934aebec54ca9106"} err="failed to get container status \"03c6ec8b0305c644cb9128a429441e50bebc572c8ee6fa11934aebec54ca9106\": rpc error: code = NotFound desc = could not find container \"03c6ec8b0305c644cb9128a429441e50bebc572c8ee6fa11934aebec54ca9106\": container with ID starting with 03c6ec8b0305c644cb9128a429441e50bebc572c8ee6fa11934aebec54ca9106 not found: ID does not exist" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.284809 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.284880 4687 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.284892 4687 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.284902 4687 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.284911 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf245\" (UniqueName: \"kubernetes.io/projected/bd32ea7d-aac0-4f3a-87fb-71e34e00889d-kube-api-access-zf245\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.546070 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-289rq"] Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.552416 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-289rq"] Feb 28 09:19:48 crc kubenswrapper[4687]: I0228 09:19:48.665808 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd32ea7d-aac0-4f3a-87fb-71e34e00889d" path="/var/lib/kubelet/pods/bd32ea7d-aac0-4f3a-87fb-71e34e00889d/volumes" Feb 28 09:19:50 crc kubenswrapper[4687]: I0228 09:19:50.267353 4687 generic.go:334] "Generic (PLEG): container finished" podID="4856ec29-c1c6-4c66-b64d-0daf938e4104" containerID="f4a255a39b6cee4bfde8c5ec52bb5a6138c52ca1ca5f193e4739b9beb18d718a" exitCode=0 Feb 28 09:19:50 crc kubenswrapper[4687]: I0228 09:19:50.267426 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8jhf4" event={"ID":"4856ec29-c1c6-4c66-b64d-0daf938e4104","Type":"ContainerDied","Data":"f4a255a39b6cee4bfde8c5ec52bb5a6138c52ca1ca5f193e4739b9beb18d718a"} Feb 28 09:19:51 crc kubenswrapper[4687]: I0228 09:19:51.550751 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-8jhf4" Feb 28 09:19:51 crc kubenswrapper[4687]: I0228 09:19:51.734306 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4856ec29-c1c6-4c66-b64d-0daf938e4104-combined-ca-bundle\") pod \"4856ec29-c1c6-4c66-b64d-0daf938e4104\" (UID: \"4856ec29-c1c6-4c66-b64d-0daf938e4104\") " Feb 28 09:19:51 crc kubenswrapper[4687]: I0228 09:19:51.734899 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4856ec29-c1c6-4c66-b64d-0daf938e4104-config-data\") pod \"4856ec29-c1c6-4c66-b64d-0daf938e4104\" (UID: \"4856ec29-c1c6-4c66-b64d-0daf938e4104\") " Feb 28 09:19:51 crc kubenswrapper[4687]: I0228 09:19:51.735172 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n7j5\" (UniqueName: \"kubernetes.io/projected/4856ec29-c1c6-4c66-b64d-0daf938e4104-kube-api-access-4n7j5\") pod \"4856ec29-c1c6-4c66-b64d-0daf938e4104\" (UID: \"4856ec29-c1c6-4c66-b64d-0daf938e4104\") " Feb 28 09:19:51 crc kubenswrapper[4687]: I0228 09:19:51.741491 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4856ec29-c1c6-4c66-b64d-0daf938e4104-kube-api-access-4n7j5" (OuterVolumeSpecName: "kube-api-access-4n7j5") pod "4856ec29-c1c6-4c66-b64d-0daf938e4104" (UID: "4856ec29-c1c6-4c66-b64d-0daf938e4104"). InnerVolumeSpecName "kube-api-access-4n7j5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:19:51 crc kubenswrapper[4687]: I0228 09:19:51.757932 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4856ec29-c1c6-4c66-b64d-0daf938e4104-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4856ec29-c1c6-4c66-b64d-0daf938e4104" (UID: "4856ec29-c1c6-4c66-b64d-0daf938e4104"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:19:51 crc kubenswrapper[4687]: I0228 09:19:51.773682 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4856ec29-c1c6-4c66-b64d-0daf938e4104-config-data" (OuterVolumeSpecName: "config-data") pod "4856ec29-c1c6-4c66-b64d-0daf938e4104" (UID: "4856ec29-c1c6-4c66-b64d-0daf938e4104"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:19:51 crc kubenswrapper[4687]: I0228 09:19:51.837839 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n7j5\" (UniqueName: \"kubernetes.io/projected/4856ec29-c1c6-4c66-b64d-0daf938e4104-kube-api-access-4n7j5\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:51 crc kubenswrapper[4687]: I0228 09:19:51.837880 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4856ec29-c1c6-4c66-b64d-0daf938e4104-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:51 crc kubenswrapper[4687]: I0228 09:19:51.837893 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4856ec29-c1c6-4c66-b64d-0daf938e4104-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.288424 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8jhf4" event={"ID":"4856ec29-c1c6-4c66-b64d-0daf938e4104","Type":"ContainerDied","Data":"36c150f1a3ccc1d84b291e7f7a4fe326b5e765786a5244537dee9c3a835a2483"} Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.288474 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36c150f1a3ccc1d84b291e7f7a4fe326b5e765786a5244537dee9c3a835a2483" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.288483 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-8jhf4" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.504185 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-2lbtc"] Feb 28 09:19:52 crc kubenswrapper[4687]: E0228 09:19:52.504514 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd32ea7d-aac0-4f3a-87fb-71e34e00889d" containerName="dnsmasq-dns" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.504532 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd32ea7d-aac0-4f3a-87fb-71e34e00889d" containerName="dnsmasq-dns" Feb 28 09:19:52 crc kubenswrapper[4687]: E0228 09:19:52.504546 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15ea0f78-cfa4-4a12-8e4e-92bc30488ad1" containerName="mariadb-account-create-update" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.504552 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="15ea0f78-cfa4-4a12-8e4e-92bc30488ad1" containerName="mariadb-account-create-update" Feb 28 09:19:52 crc kubenswrapper[4687]: E0228 09:19:52.504563 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14080cd3-b175-4324-aacc-c3c47ead6896" containerName="mariadb-database-create" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.504570 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="14080cd3-b175-4324-aacc-c3c47ead6896" containerName="mariadb-database-create" Feb 28 09:19:52 crc kubenswrapper[4687]: E0228 09:19:52.504580 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd32ea7d-aac0-4f3a-87fb-71e34e00889d" containerName="init" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.504586 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd32ea7d-aac0-4f3a-87fb-71e34e00889d" containerName="init" Feb 28 09:19:52 crc kubenswrapper[4687]: E0228 09:19:52.504595 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d4394e-e0a9-4ea7-b670-fd088aa62341" 
containerName="mariadb-account-create-update" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.504601 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d4394e-e0a9-4ea7-b670-fd088aa62341" containerName="mariadb-account-create-update" Feb 28 09:19:52 crc kubenswrapper[4687]: E0228 09:19:52.504624 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4856ec29-c1c6-4c66-b64d-0daf938e4104" containerName="keystone-db-sync" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.504631 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="4856ec29-c1c6-4c66-b64d-0daf938e4104" containerName="keystone-db-sync" Feb 28 09:19:52 crc kubenswrapper[4687]: E0228 09:19:52.504641 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c99cb41b-642b-4dab-bd03-a8f61456a0c5" containerName="mariadb-database-create" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.504646 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99cb41b-642b-4dab-bd03-a8f61456a0c5" containerName="mariadb-database-create" Feb 28 09:19:52 crc kubenswrapper[4687]: E0228 09:19:52.504657 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae33b9ae-c76a-41e3-9497-f6cbe4f4b740" containerName="mariadb-account-create-update" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.504663 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae33b9ae-c76a-41e3-9497-f6cbe4f4b740" containerName="mariadb-account-create-update" Feb 28 09:19:52 crc kubenswrapper[4687]: E0228 09:19:52.504676 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1214eb91-e4cb-4337-ab5c-e27c0dd55151" containerName="mariadb-database-create" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.504681 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="1214eb91-e4cb-4337-ab5c-e27c0dd55151" containerName="mariadb-database-create" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.504822 4687 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="15ea0f78-cfa4-4a12-8e4e-92bc30488ad1" containerName="mariadb-account-create-update" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.504833 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae33b9ae-c76a-41e3-9497-f6cbe4f4b740" containerName="mariadb-account-create-update" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.504842 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="47d4394e-e0a9-4ea7-b670-fd088aa62341" containerName="mariadb-account-create-update" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.504851 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd32ea7d-aac0-4f3a-87fb-71e34e00889d" containerName="dnsmasq-dns" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.504860 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="c99cb41b-642b-4dab-bd03-a8f61456a0c5" containerName="mariadb-database-create" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.504872 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="4856ec29-c1c6-4c66-b64d-0daf938e4104" containerName="keystone-db-sync" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.504879 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="14080cd3-b175-4324-aacc-c3c47ead6896" containerName="mariadb-database-create" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.504887 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="1214eb91-e4cb-4337-ab5c-e27c0dd55151" containerName="mariadb-database-create" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.522802 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-2lbtc" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.534789 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-2lbtc"] Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.589777 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-29vzx"] Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.591444 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-29vzx" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.594357 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.596365 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-29vzx"] Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.599130 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-29qml" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.599293 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.599493 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.599589 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.652296 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-fernet-keys\") pod \"keystone-bootstrap-29vzx\" (UID: \"565a264d-399a-47d9-8273-b8ca22fdc8b6\") " pod="openstack/keystone-bootstrap-29vzx" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.652340 
4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-ovsdbserver-sb\") pod \"dnsmasq-dns-5985c59c55-2lbtc\" (UID: \"588dbc79-5684-4058-8b12-0cad014a4cc4\") " pod="openstack/dnsmasq-dns-5985c59c55-2lbtc" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.652374 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-dns-svc\") pod \"dnsmasq-dns-5985c59c55-2lbtc\" (UID: \"588dbc79-5684-4058-8b12-0cad014a4cc4\") " pod="openstack/dnsmasq-dns-5985c59c55-2lbtc" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.652396 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-credential-keys\") pod \"keystone-bootstrap-29vzx\" (UID: \"565a264d-399a-47d9-8273-b8ca22fdc8b6\") " pod="openstack/keystone-bootstrap-29vzx" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.652413 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-config-data\") pod \"keystone-bootstrap-29vzx\" (UID: \"565a264d-399a-47d9-8273-b8ca22fdc8b6\") " pod="openstack/keystone-bootstrap-29vzx" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.652431 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-dns-swift-storage-0\") pod \"dnsmasq-dns-5985c59c55-2lbtc\" (UID: \"588dbc79-5684-4058-8b12-0cad014a4cc4\") " pod="openstack/dnsmasq-dns-5985c59c55-2lbtc" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 
09:19:52.652470 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-scripts\") pod \"keystone-bootstrap-29vzx\" (UID: \"565a264d-399a-47d9-8273-b8ca22fdc8b6\") " pod="openstack/keystone-bootstrap-29vzx" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.652487 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh2j9\" (UniqueName: \"kubernetes.io/projected/588dbc79-5684-4058-8b12-0cad014a4cc4-kube-api-access-xh2j9\") pod \"dnsmasq-dns-5985c59c55-2lbtc\" (UID: \"588dbc79-5684-4058-8b12-0cad014a4cc4\") " pod="openstack/dnsmasq-dns-5985c59c55-2lbtc" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.652520 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-config\") pod \"dnsmasq-dns-5985c59c55-2lbtc\" (UID: \"588dbc79-5684-4058-8b12-0cad014a4cc4\") " pod="openstack/dnsmasq-dns-5985c59c55-2lbtc" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.652563 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-combined-ca-bundle\") pod \"keystone-bootstrap-29vzx\" (UID: \"565a264d-399a-47d9-8273-b8ca22fdc8b6\") " pod="openstack/keystone-bootstrap-29vzx" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.652586 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zph6k\" (UniqueName: \"kubernetes.io/projected/565a264d-399a-47d9-8273-b8ca22fdc8b6-kube-api-access-zph6k\") pod \"keystone-bootstrap-29vzx\" (UID: \"565a264d-399a-47d9-8273-b8ca22fdc8b6\") " pod="openstack/keystone-bootstrap-29vzx" Feb 28 09:19:52 crc 
kubenswrapper[4687]: I0228 09:19:52.652625 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-ovsdbserver-nb\") pod \"dnsmasq-dns-5985c59c55-2lbtc\" (UID: \"588dbc79-5684-4058-8b12-0cad014a4cc4\") " pod="openstack/dnsmasq-dns-5985c59c55-2lbtc" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.716580 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6774d8fcc9-lpttg"] Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.718340 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6774d8fcc9-lpttg" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.721234 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.721419 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.721651 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-n4v2j" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.721774 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.743075 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6774d8fcc9-lpttg"] Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.755489 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-config\") pod \"dnsmasq-dns-5985c59c55-2lbtc\" (UID: \"588dbc79-5684-4058-8b12-0cad014a4cc4\") " pod="openstack/dnsmasq-dns-5985c59c55-2lbtc" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.755539 4687 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76f683cb-cc38-4cdd-a0f0-1077410b1768-horizon-secret-key\") pod \"horizon-6774d8fcc9-lpttg\" (UID: \"76f683cb-cc38-4cdd-a0f0-1077410b1768\") " pod="openstack/horizon-6774d8fcc9-lpttg" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.755565 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76f683cb-cc38-4cdd-a0f0-1077410b1768-config-data\") pod \"horizon-6774d8fcc9-lpttg\" (UID: \"76f683cb-cc38-4cdd-a0f0-1077410b1768\") " pod="openstack/horizon-6774d8fcc9-lpttg" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.755604 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-combined-ca-bundle\") pod \"keystone-bootstrap-29vzx\" (UID: \"565a264d-399a-47d9-8273-b8ca22fdc8b6\") " pod="openstack/keystone-bootstrap-29vzx" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.755635 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zph6k\" (UniqueName: \"kubernetes.io/projected/565a264d-399a-47d9-8273-b8ca22fdc8b6-kube-api-access-zph6k\") pod \"keystone-bootstrap-29vzx\" (UID: \"565a264d-399a-47d9-8273-b8ca22fdc8b6\") " pod="openstack/keystone-bootstrap-29vzx" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.755678 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-ovsdbserver-nb\") pod \"dnsmasq-dns-5985c59c55-2lbtc\" (UID: \"588dbc79-5684-4058-8b12-0cad014a4cc4\") " pod="openstack/dnsmasq-dns-5985c59c55-2lbtc" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.755708 4687 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vg96\" (UniqueName: \"kubernetes.io/projected/76f683cb-cc38-4cdd-a0f0-1077410b1768-kube-api-access-4vg96\") pod \"horizon-6774d8fcc9-lpttg\" (UID: \"76f683cb-cc38-4cdd-a0f0-1077410b1768\") " pod="openstack/horizon-6774d8fcc9-lpttg" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.755739 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76f683cb-cc38-4cdd-a0f0-1077410b1768-scripts\") pod \"horizon-6774d8fcc9-lpttg\" (UID: \"76f683cb-cc38-4cdd-a0f0-1077410b1768\") " pod="openstack/horizon-6774d8fcc9-lpttg" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.755767 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-fernet-keys\") pod \"keystone-bootstrap-29vzx\" (UID: \"565a264d-399a-47d9-8273-b8ca22fdc8b6\") " pod="openstack/keystone-bootstrap-29vzx" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.755784 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76f683cb-cc38-4cdd-a0f0-1077410b1768-logs\") pod \"horizon-6774d8fcc9-lpttg\" (UID: \"76f683cb-cc38-4cdd-a0f0-1077410b1768\") " pod="openstack/horizon-6774d8fcc9-lpttg" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.755802 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-ovsdbserver-sb\") pod \"dnsmasq-dns-5985c59c55-2lbtc\" (UID: \"588dbc79-5684-4058-8b12-0cad014a4cc4\") " pod="openstack/dnsmasq-dns-5985c59c55-2lbtc" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.755828 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-dns-svc\") pod \"dnsmasq-dns-5985c59c55-2lbtc\" (UID: \"588dbc79-5684-4058-8b12-0cad014a4cc4\") " pod="openstack/dnsmasq-dns-5985c59c55-2lbtc" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.755849 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-credential-keys\") pod \"keystone-bootstrap-29vzx\" (UID: \"565a264d-399a-47d9-8273-b8ca22fdc8b6\") " pod="openstack/keystone-bootstrap-29vzx" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.755865 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-config-data\") pod \"keystone-bootstrap-29vzx\" (UID: \"565a264d-399a-47d9-8273-b8ca22fdc8b6\") " pod="openstack/keystone-bootstrap-29vzx" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.755882 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-dns-swift-storage-0\") pod \"dnsmasq-dns-5985c59c55-2lbtc\" (UID: \"588dbc79-5684-4058-8b12-0cad014a4cc4\") " pod="openstack/dnsmasq-dns-5985c59c55-2lbtc" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.755920 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-scripts\") pod \"keystone-bootstrap-29vzx\" (UID: \"565a264d-399a-47d9-8273-b8ca22fdc8b6\") " pod="openstack/keystone-bootstrap-29vzx" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.755939 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh2j9\" (UniqueName: 
\"kubernetes.io/projected/588dbc79-5684-4058-8b12-0cad014a4cc4-kube-api-access-xh2j9\") pod \"dnsmasq-dns-5985c59c55-2lbtc\" (UID: \"588dbc79-5684-4058-8b12-0cad014a4cc4\") " pod="openstack/dnsmasq-dns-5985c59c55-2lbtc" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.756960 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-config\") pod \"dnsmasq-dns-5985c59c55-2lbtc\" (UID: \"588dbc79-5684-4058-8b12-0cad014a4cc4\") " pod="openstack/dnsmasq-dns-5985c59c55-2lbtc" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.757836 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-ovsdbserver-nb\") pod \"dnsmasq-dns-5985c59c55-2lbtc\" (UID: \"588dbc79-5684-4058-8b12-0cad014a4cc4\") " pod="openstack/dnsmasq-dns-5985c59c55-2lbtc" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.758384 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-ovsdbserver-sb\") pod \"dnsmasq-dns-5985c59c55-2lbtc\" (UID: \"588dbc79-5684-4058-8b12-0cad014a4cc4\") " pod="openstack/dnsmasq-dns-5985c59c55-2lbtc" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.758879 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-dns-svc\") pod \"dnsmasq-dns-5985c59c55-2lbtc\" (UID: \"588dbc79-5684-4058-8b12-0cad014a4cc4\") " pod="openstack/dnsmasq-dns-5985c59c55-2lbtc" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.759464 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-dns-swift-storage-0\") pod 
\"dnsmasq-dns-5985c59c55-2lbtc\" (UID: \"588dbc79-5684-4058-8b12-0cad014a4cc4\") " pod="openstack/dnsmasq-dns-5985c59c55-2lbtc" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.766915 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-credential-keys\") pod \"keystone-bootstrap-29vzx\" (UID: \"565a264d-399a-47d9-8273-b8ca22fdc8b6\") " pod="openstack/keystone-bootstrap-29vzx" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.767076 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-scripts\") pod \"keystone-bootstrap-29vzx\" (UID: \"565a264d-399a-47d9-8273-b8ca22fdc8b6\") " pod="openstack/keystone-bootstrap-29vzx" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.767257 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.769207 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.769797 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-c9j72"] Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.772549 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.772800 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.775838 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-c9j72" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.780567 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.780737 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.780850 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-t48hh" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.781642 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-config-data\") pod \"keystone-bootstrap-29vzx\" (UID: \"565a264d-399a-47d9-8273-b8ca22fdc8b6\") " pod="openstack/keystone-bootstrap-29vzx" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.785473 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-combined-ca-bundle\") pod \"keystone-bootstrap-29vzx\" (UID: \"565a264d-399a-47d9-8273-b8ca22fdc8b6\") " pod="openstack/keystone-bootstrap-29vzx" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.788606 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh2j9\" (UniqueName: \"kubernetes.io/projected/588dbc79-5684-4058-8b12-0cad014a4cc4-kube-api-access-xh2j9\") pod \"dnsmasq-dns-5985c59c55-2lbtc\" (UID: \"588dbc79-5684-4058-8b12-0cad014a4cc4\") " pod="openstack/dnsmasq-dns-5985c59c55-2lbtc" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.791136 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zph6k\" (UniqueName: \"kubernetes.io/projected/565a264d-399a-47d9-8273-b8ca22fdc8b6-kube-api-access-zph6k\") pod \"keystone-bootstrap-29vzx\" (UID: 
\"565a264d-399a-47d9-8273-b8ca22fdc8b6\") " pod="openstack/keystone-bootstrap-29vzx" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.797455 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-fernet-keys\") pod \"keystone-bootstrap-29vzx\" (UID: \"565a264d-399a-47d9-8273-b8ca22fdc8b6\") " pod="openstack/keystone-bootstrap-29vzx" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.805590 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.814808 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-c9j72"] Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.840497 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-2lbtc" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.856800 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76f683cb-cc38-4cdd-a0f0-1077410b1768-logs\") pod \"horizon-6774d8fcc9-lpttg\" (UID: \"76f683cb-cc38-4cdd-a0f0-1077410b1768\") " pod="openstack/horizon-6774d8fcc9-lpttg" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.856840 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-run-httpd\") pod \"ceilometer-0\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " pod="openstack/ceilometer-0" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.856868 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e5e221e-73c7-44a2-9af9-0feb60b412e0-db-sync-config-data\") pod \"cinder-db-sync-c9j72\" (UID: 
\"3e5e221e-73c7-44a2-9af9-0feb60b412e0\") " pod="openstack/cinder-db-sync-c9j72" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.856891 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " pod="openstack/ceilometer-0" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.856938 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-config-data\") pod \"ceilometer-0\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " pod="openstack/ceilometer-0" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.856956 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " pod="openstack/ceilometer-0" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.856976 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76f683cb-cc38-4cdd-a0f0-1077410b1768-horizon-secret-key\") pod \"horizon-6774d8fcc9-lpttg\" (UID: \"76f683cb-cc38-4cdd-a0f0-1077410b1768\") " pod="openstack/horizon-6774d8fcc9-lpttg" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.856996 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76f683cb-cc38-4cdd-a0f0-1077410b1768-config-data\") pod \"horizon-6774d8fcc9-lpttg\" (UID: \"76f683cb-cc38-4cdd-a0f0-1077410b1768\") " pod="openstack/horizon-6774d8fcc9-lpttg" Feb 28 09:19:52 crc 
kubenswrapper[4687]: I0228 09:19:52.857044 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wvxd\" (UniqueName: \"kubernetes.io/projected/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-kube-api-access-2wvxd\") pod \"ceilometer-0\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " pod="openstack/ceilometer-0" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.857061 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e5e221e-73c7-44a2-9af9-0feb60b412e0-combined-ca-bundle\") pod \"cinder-db-sync-c9j72\" (UID: \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\") " pod="openstack/cinder-db-sync-c9j72" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.857075 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e5e221e-73c7-44a2-9af9-0feb60b412e0-scripts\") pod \"cinder-db-sync-c9j72\" (UID: \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\") " pod="openstack/cinder-db-sync-c9j72" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.857092 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42fhc\" (UniqueName: \"kubernetes.io/projected/3e5e221e-73c7-44a2-9af9-0feb60b412e0-kube-api-access-42fhc\") pod \"cinder-db-sync-c9j72\" (UID: \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\") " pod="openstack/cinder-db-sync-c9j72" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.857122 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e5e221e-73c7-44a2-9af9-0feb60b412e0-config-data\") pod \"cinder-db-sync-c9j72\" (UID: \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\") " pod="openstack/cinder-db-sync-c9j72" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.857137 4687 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-scripts\") pod \"ceilometer-0\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " pod="openstack/ceilometer-0" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.857153 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-log-httpd\") pod \"ceilometer-0\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " pod="openstack/ceilometer-0" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.857168 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e5e221e-73c7-44a2-9af9-0feb60b412e0-etc-machine-id\") pod \"cinder-db-sync-c9j72\" (UID: \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\") " pod="openstack/cinder-db-sync-c9j72" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.857186 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vg96\" (UniqueName: \"kubernetes.io/projected/76f683cb-cc38-4cdd-a0f0-1077410b1768-kube-api-access-4vg96\") pod \"horizon-6774d8fcc9-lpttg\" (UID: \"76f683cb-cc38-4cdd-a0f0-1077410b1768\") " pod="openstack/horizon-6774d8fcc9-lpttg" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.857204 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76f683cb-cc38-4cdd-a0f0-1077410b1768-scripts\") pod \"horizon-6774d8fcc9-lpttg\" (UID: \"76f683cb-cc38-4cdd-a0f0-1077410b1768\") " pod="openstack/horizon-6774d8fcc9-lpttg" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.857798 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/76f683cb-cc38-4cdd-a0f0-1077410b1768-scripts\") pod \"horizon-6774d8fcc9-lpttg\" (UID: \"76f683cb-cc38-4cdd-a0f0-1077410b1768\") " pod="openstack/horizon-6774d8fcc9-lpttg" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.858012 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76f683cb-cc38-4cdd-a0f0-1077410b1768-logs\") pod \"horizon-6774d8fcc9-lpttg\" (UID: \"76f683cb-cc38-4cdd-a0f0-1077410b1768\") " pod="openstack/horizon-6774d8fcc9-lpttg" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.859347 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76f683cb-cc38-4cdd-a0f0-1077410b1768-config-data\") pod \"horizon-6774d8fcc9-lpttg\" (UID: \"76f683cb-cc38-4cdd-a0f0-1077410b1768\") " pod="openstack/horizon-6774d8fcc9-lpttg" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.867372 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76f683cb-cc38-4cdd-a0f0-1077410b1768-horizon-secret-key\") pod \"horizon-6774d8fcc9-lpttg\" (UID: \"76f683cb-cc38-4cdd-a0f0-1077410b1768\") " pod="openstack/horizon-6774d8fcc9-lpttg" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.890404 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-94db9c8bf-6qj27"] Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.892789 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-94db9c8bf-6qj27" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.909662 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-94db9c8bf-6qj27"] Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.912836 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vg96\" (UniqueName: \"kubernetes.io/projected/76f683cb-cc38-4cdd-a0f0-1077410b1768-kube-api-access-4vg96\") pod \"horizon-6774d8fcc9-lpttg\" (UID: \"76f683cb-cc38-4cdd-a0f0-1077410b1768\") " pod="openstack/horizon-6774d8fcc9-lpttg" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.917014 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-br7kf"] Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.918068 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-br7kf" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.918844 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-29vzx" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.928374 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.928572 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.928736 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-v7cv7" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.949452 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-br7kf"] Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.966710 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-run-httpd\") pod \"ceilometer-0\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " pod="openstack/ceilometer-0" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.966751 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e5e221e-73c7-44a2-9af9-0feb60b412e0-db-sync-config-data\") pod \"cinder-db-sync-c9j72\" (UID: \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\") " pod="openstack/cinder-db-sync-c9j72" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.966779 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " pod="openstack/ceilometer-0" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.966828 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-config-data\") pod \"ceilometer-0\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " pod="openstack/ceilometer-0" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.966845 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " pod="openstack/ceilometer-0" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.966874 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wvxd\" (UniqueName: \"kubernetes.io/projected/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-kube-api-access-2wvxd\") pod \"ceilometer-0\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " pod="openstack/ceilometer-0" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.966889 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e5e221e-73c7-44a2-9af9-0feb60b412e0-combined-ca-bundle\") pod \"cinder-db-sync-c9j72\" (UID: \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\") " pod="openstack/cinder-db-sync-c9j72" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.966907 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e5e221e-73c7-44a2-9af9-0feb60b412e0-scripts\") pod \"cinder-db-sync-c9j72\" (UID: \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\") " pod="openstack/cinder-db-sync-c9j72" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.966924 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42fhc\" (UniqueName: \"kubernetes.io/projected/3e5e221e-73c7-44a2-9af9-0feb60b412e0-kube-api-access-42fhc\") pod \"cinder-db-sync-c9j72\" (UID: \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\") " 
pod="openstack/cinder-db-sync-c9j72" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.966967 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e5e221e-73c7-44a2-9af9-0feb60b412e0-config-data\") pod \"cinder-db-sync-c9j72\" (UID: \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\") " pod="openstack/cinder-db-sync-c9j72" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.966980 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-scripts\") pod \"ceilometer-0\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " pod="openstack/ceilometer-0" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.966996 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-log-httpd\") pod \"ceilometer-0\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " pod="openstack/ceilometer-0" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.967012 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e5e221e-73c7-44a2-9af9-0feb60b412e0-etc-machine-id\") pod \"cinder-db-sync-c9j72\" (UID: \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\") " pod="openstack/cinder-db-sync-c9j72" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.967124 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e5e221e-73c7-44a2-9af9-0feb60b412e0-etc-machine-id\") pod \"cinder-db-sync-c9j72\" (UID: \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\") " pod="openstack/cinder-db-sync-c9j72" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.969166 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-run-httpd\") pod \"ceilometer-0\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " pod="openstack/ceilometer-0" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.975158 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-log-httpd\") pod \"ceilometer-0\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " pod="openstack/ceilometer-0" Feb 28 09:19:52 crc kubenswrapper[4687]: I0228 09:19:52.987358 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-scripts\") pod \"ceilometer-0\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " pod="openstack/ceilometer-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:52.996913 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-mvkm8"] Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:52.997195 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wvxd\" (UniqueName: \"kubernetes.io/projected/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-kube-api-access-2wvxd\") pod \"ceilometer-0\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " pod="openstack/ceilometer-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.001560 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " pod="openstack/ceilometer-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.004700 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " pod="openstack/ceilometer-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.005702 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-config-data\") pod \"ceilometer-0\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " pod="openstack/ceilometer-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.015348 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e5e221e-73c7-44a2-9af9-0feb60b412e0-db-sync-config-data\") pod \"cinder-db-sync-c9j72\" (UID: \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\") " pod="openstack/cinder-db-sync-c9j72" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.018510 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e5e221e-73c7-44a2-9af9-0feb60b412e0-scripts\") pod \"cinder-db-sync-c9j72\" (UID: \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\") " pod="openstack/cinder-db-sync-c9j72" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.020235 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-mvkm8" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.020982 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e5e221e-73c7-44a2-9af9-0feb60b412e0-combined-ca-bundle\") pod \"cinder-db-sync-c9j72\" (UID: \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\") " pod="openstack/cinder-db-sync-c9j72" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.021240 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e5e221e-73c7-44a2-9af9-0feb60b412e0-config-data\") pod \"cinder-db-sync-c9j72\" (UID: \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\") " pod="openstack/cinder-db-sync-c9j72" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.035723 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-bmtlj" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.035951 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42fhc\" (UniqueName: \"kubernetes.io/projected/3e5e221e-73c7-44a2-9af9-0feb60b412e0-kube-api-access-42fhc\") pod \"cinder-db-sync-c9j72\" (UID: \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\") " pod="openstack/cinder-db-sync-c9j72" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.036240 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.036366 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6774d8fcc9-lpttg" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.048595 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-mvkm8"] Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.057513 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.059181 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.062336 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.062556 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.062735 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.062859 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-v85jk" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.070337 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.074398 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268be2d7-dd2e-42f0-b112-230de1abb1d4-combined-ca-bundle\") pod \"neutron-db-sync-br7kf\" (UID: \"268be2d7-dd2e-42f0-b112-230de1abb1d4\") " pod="openstack/neutron-db-sync-br7kf" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.078285 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/27799696-4eb6-4ef9-9440-151a3929d699-scripts\") pod \"horizon-94db9c8bf-6qj27\" (UID: \"27799696-4eb6-4ef9-9440-151a3929d699\") " pod="openstack/horizon-94db9c8bf-6qj27" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.078574 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/27799696-4eb6-4ef9-9440-151a3929d699-config-data\") pod \"horizon-94db9c8bf-6qj27\" (UID: \"27799696-4eb6-4ef9-9440-151a3929d699\") " pod="openstack/horizon-94db9c8bf-6qj27" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.078748 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/27799696-4eb6-4ef9-9440-151a3929d699-horizon-secret-key\") pod \"horizon-94db9c8bf-6qj27\" (UID: \"27799696-4eb6-4ef9-9440-151a3929d699\") " pod="openstack/horizon-94db9c8bf-6qj27" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.078811 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/268be2d7-dd2e-42f0-b112-230de1abb1d4-config\") pod \"neutron-db-sync-br7kf\" (UID: \"268be2d7-dd2e-42f0-b112-230de1abb1d4\") " pod="openstack/neutron-db-sync-br7kf" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.078931 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t9mp\" (UniqueName: \"kubernetes.io/projected/268be2d7-dd2e-42f0-b112-230de1abb1d4-kube-api-access-4t9mp\") pod \"neutron-db-sync-br7kf\" (UID: \"268be2d7-dd2e-42f0-b112-230de1abb1d4\") " pod="openstack/neutron-db-sync-br7kf" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.079265 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/27799696-4eb6-4ef9-9440-151a3929d699-logs\") pod \"horizon-94db9c8bf-6qj27\" (UID: \"27799696-4eb6-4ef9-9440-151a3929d699\") " pod="openstack/horizon-94db9c8bf-6qj27" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.079331 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv864\" (UniqueName: \"kubernetes.io/projected/27799696-4eb6-4ef9-9440-151a3929d699-kube-api-access-sv864\") pod \"horizon-94db9c8bf-6qj27\" (UID: \"27799696-4eb6-4ef9-9440-151a3929d699\") " pod="openstack/horizon-94db9c8bf-6qj27" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.084104 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-2lbtc"] Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.098909 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-mcfl6"] Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.099924 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-mcfl6" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.106878 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.107105 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-q9nmm" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.107227 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.126422 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-mcfl6"] Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.139252 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-bw8wq"] Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.162742 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-bw8wq"] Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.162875 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.206385 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/27799696-4eb6-4ef9-9440-151a3929d699-horizon-secret-key\") pod \"horizon-94db9c8bf-6qj27\" (UID: \"27799696-4eb6-4ef9-9440-151a3929d699\") " pod="openstack/horizon-94db9c8bf-6qj27" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.206465 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/268be2d7-dd2e-42f0-b112-230de1abb1d4-config\") pod \"neutron-db-sync-br7kf\" (UID: \"268be2d7-dd2e-42f0-b112-230de1abb1d4\") " pod="openstack/neutron-db-sync-br7kf" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.206512 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fd2f789-f994-429e-8eb0-2c37a0108808-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " pod="openstack/glance-default-external-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.206576 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7552z\" (UniqueName: \"kubernetes.io/projected/9fd2f789-f994-429e-8eb0-2c37a0108808-kube-api-access-7552z\") pod \"glance-default-external-api-0\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " pod="openstack/glance-default-external-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.206624 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9fd2f789-f994-429e-8eb0-2c37a0108808-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") 
" pod="openstack/glance-default-external-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.206672 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t9mp\" (UniqueName: \"kubernetes.io/projected/268be2d7-dd2e-42f0-b112-230de1abb1d4-kube-api-access-4t9mp\") pod \"neutron-db-sync-br7kf\" (UID: \"268be2d7-dd2e-42f0-b112-230de1abb1d4\") " pod="openstack/neutron-db-sync-br7kf" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.206707 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk7lz\" (UniqueName: \"kubernetes.io/projected/21a39679-80b0-4a80-ad64-fe3707c2a9f0-kube-api-access-fk7lz\") pod \"barbican-db-sync-mvkm8\" (UID: \"21a39679-80b0-4a80-ad64-fe3707c2a9f0\") " pod="openstack/barbican-db-sync-mvkm8" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.206742 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27799696-4eb6-4ef9-9440-151a3929d699-logs\") pod \"horizon-94db9c8bf-6qj27\" (UID: \"27799696-4eb6-4ef9-9440-151a3929d699\") " pod="openstack/horizon-94db9c8bf-6qj27" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.206766 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fd2f789-f994-429e-8eb0-2c37a0108808-logs\") pod \"glance-default-external-api-0\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " pod="openstack/glance-default-external-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.206789 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fd2f789-f994-429e-8eb0-2c37a0108808-config-data\") pod \"glance-default-external-api-0\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " pod="openstack/glance-default-external-api-0" 
Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.206823 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a39679-80b0-4a80-ad64-fe3707c2a9f0-combined-ca-bundle\") pod \"barbican-db-sync-mvkm8\" (UID: \"21a39679-80b0-4a80-ad64-fe3707c2a9f0\") " pod="openstack/barbican-db-sync-mvkm8" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.206863 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv864\" (UniqueName: \"kubernetes.io/projected/27799696-4eb6-4ef9-9440-151a3929d699-kube-api-access-sv864\") pod \"horizon-94db9c8bf-6qj27\" (UID: \"27799696-4eb6-4ef9-9440-151a3929d699\") " pod="openstack/horizon-94db9c8bf-6qj27" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.206966 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fd2f789-f994-429e-8eb0-2c37a0108808-scripts\") pod \"glance-default-external-api-0\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " pod="openstack/glance-default-external-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.207008 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268be2d7-dd2e-42f0-b112-230de1abb1d4-combined-ca-bundle\") pod \"neutron-db-sync-br7kf\" (UID: \"268be2d7-dd2e-42f0-b112-230de1abb1d4\") " pod="openstack/neutron-db-sync-br7kf" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.207096 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fd2f789-f994-429e-8eb0-2c37a0108808-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " pod="openstack/glance-default-external-api-0" Feb 28 09:19:53 crc 
kubenswrapper[4687]: I0228 09:19:53.207138 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/27799696-4eb6-4ef9-9440-151a3929d699-scripts\") pod \"horizon-94db9c8bf-6qj27\" (UID: \"27799696-4eb6-4ef9-9440-151a3929d699\") " pod="openstack/horizon-94db9c8bf-6qj27" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.207193 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " pod="openstack/glance-default-external-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.207230 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/27799696-4eb6-4ef9-9440-151a3929d699-config-data\") pod \"horizon-94db9c8bf-6qj27\" (UID: \"27799696-4eb6-4ef9-9440-151a3929d699\") " pod="openstack/horizon-94db9c8bf-6qj27" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.207298 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/21a39679-80b0-4a80-ad64-fe3707c2a9f0-db-sync-config-data\") pod \"barbican-db-sync-mvkm8\" (UID: \"21a39679-80b0-4a80-ad64-fe3707c2a9f0\") " pod="openstack/barbican-db-sync-mvkm8" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.208539 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27799696-4eb6-4ef9-9440-151a3929d699-logs\") pod \"horizon-94db9c8bf-6qj27\" (UID: \"27799696-4eb6-4ef9-9440-151a3929d699\") " pod="openstack/horizon-94db9c8bf-6qj27" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.209769 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/27799696-4eb6-4ef9-9440-151a3929d699-scripts\") pod \"horizon-94db9c8bf-6qj27\" (UID: \"27799696-4eb6-4ef9-9440-151a3929d699\") " pod="openstack/horizon-94db9c8bf-6qj27" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.210421 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/27799696-4eb6-4ef9-9440-151a3929d699-config-data\") pod \"horizon-94db9c8bf-6qj27\" (UID: \"27799696-4eb6-4ef9-9440-151a3929d699\") " pod="openstack/horizon-94db9c8bf-6qj27" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.218508 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268be2d7-dd2e-42f0-b112-230de1abb1d4-combined-ca-bundle\") pod \"neutron-db-sync-br7kf\" (UID: \"268be2d7-dd2e-42f0-b112-230de1abb1d4\") " pod="openstack/neutron-db-sync-br7kf" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.218770 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/268be2d7-dd2e-42f0-b112-230de1abb1d4-config\") pod \"neutron-db-sync-br7kf\" (UID: \"268be2d7-dd2e-42f0-b112-230de1abb1d4\") " pod="openstack/neutron-db-sync-br7kf" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.218792 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/27799696-4eb6-4ef9-9440-151a3929d699-horizon-secret-key\") pod \"horizon-94db9c8bf-6qj27\" (UID: \"27799696-4eb6-4ef9-9440-151a3929d699\") " pod="openstack/horizon-94db9c8bf-6qj27" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.223803 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t9mp\" (UniqueName: \"kubernetes.io/projected/268be2d7-dd2e-42f0-b112-230de1abb1d4-kube-api-access-4t9mp\") pod \"neutron-db-sync-br7kf\" (UID: 
\"268be2d7-dd2e-42f0-b112-230de1abb1d4\") " pod="openstack/neutron-db-sync-br7kf" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.231409 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv864\" (UniqueName: \"kubernetes.io/projected/27799696-4eb6-4ef9-9440-151a3929d699-kube-api-access-sv864\") pod \"horizon-94db9c8bf-6qj27\" (UID: \"27799696-4eb6-4ef9-9440-151a3929d699\") " pod="openstack/horizon-94db9c8bf-6qj27" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.288917 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.309642 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fd2f789-f994-429e-8eb0-2c37a0108808-config-data\") pod \"glance-default-external-api-0\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " pod="openstack/glance-default-external-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.309858 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a39679-80b0-4a80-ad64-fe3707c2a9f0-combined-ca-bundle\") pod \"barbican-db-sync-mvkm8\" (UID: \"21a39679-80b0-4a80-ad64-fe3707c2a9f0\") " pod="openstack/barbican-db-sync-mvkm8" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.309981 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9f75\" (UniqueName: \"kubernetes.io/projected/baee8d66-1152-499a-9e04-1c58353c4651-kube-api-access-q9f75\") pod \"dnsmasq-dns-ccd7c9f8f-bw8wq\" (UID: \"baee8d66-1152-499a-9e04-1c58353c4651\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.310087 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-config-data\") pod \"placement-db-sync-mcfl6\" (UID: \"ef1fa0a3-ab49-4807-a503-3a51a2b70e26\") " pod="openstack/placement-db-sync-mcfl6" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.310174 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-config\") pod \"dnsmasq-dns-ccd7c9f8f-bw8wq\" (UID: \"baee8d66-1152-499a-9e04-1c58353c4651\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.310287 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-dns-svc\") pod \"dnsmasq-dns-ccd7c9f8f-bw8wq\" (UID: \"baee8d66-1152-499a-9e04-1c58353c4651\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.310365 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fd2f789-f994-429e-8eb0-2c37a0108808-scripts\") pod \"glance-default-external-api-0\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " pod="openstack/glance-default-external-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.310454 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnpfp\" (UniqueName: \"kubernetes.io/projected/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-kube-api-access-rnpfp\") pod \"placement-db-sync-mcfl6\" (UID: \"ef1fa0a3-ab49-4807-a503-3a51a2b70e26\") " pod="openstack/placement-db-sync-mcfl6" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.310564 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9fd2f789-f994-429e-8eb0-2c37a0108808-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " pod="openstack/glance-default-external-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.310830 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-combined-ca-bundle\") pod \"placement-db-sync-mcfl6\" (UID: \"ef1fa0a3-ab49-4807-a503-3a51a2b70e26\") " pod="openstack/placement-db-sync-mcfl6" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.310897 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-logs\") pod \"placement-db-sync-mcfl6\" (UID: \"ef1fa0a3-ab49-4807-a503-3a51a2b70e26\") " pod="openstack/placement-db-sync-mcfl6" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.311053 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " pod="openstack/glance-default-external-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.311214 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/21a39679-80b0-4a80-ad64-fe3707c2a9f0-db-sync-config-data\") pod \"barbican-db-sync-mvkm8\" (UID: \"21a39679-80b0-4a80-ad64-fe3707c2a9f0\") " pod="openstack/barbican-db-sync-mvkm8" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.311284 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-ovsdbserver-nb\") pod \"dnsmasq-dns-ccd7c9f8f-bw8wq\" (UID: \"baee8d66-1152-499a-9e04-1c58353c4651\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.311387 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-dns-swift-storage-0\") pod \"dnsmasq-dns-ccd7c9f8f-bw8wq\" (UID: \"baee8d66-1152-499a-9e04-1c58353c4651\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.311583 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fd2f789-f994-429e-8eb0-2c37a0108808-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " pod="openstack/glance-default-external-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.311681 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-scripts\") pod \"placement-db-sync-mcfl6\" (UID: \"ef1fa0a3-ab49-4807-a503-3a51a2b70e26\") " pod="openstack/placement-db-sync-mcfl6" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.311753 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7552z\" (UniqueName: \"kubernetes.io/projected/9fd2f789-f994-429e-8eb0-2c37a0108808-kube-api-access-7552z\") pod \"glance-default-external-api-0\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " pod="openstack/glance-default-external-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.311813 4687 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.312957 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9fd2f789-f994-429e-8eb0-2c37a0108808-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " pod="openstack/glance-default-external-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.313090 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk7lz\" (UniqueName: \"kubernetes.io/projected/21a39679-80b0-4a80-ad64-fe3707c2a9f0-kube-api-access-fk7lz\") pod \"barbican-db-sync-mvkm8\" (UID: \"21a39679-80b0-4a80-ad64-fe3707c2a9f0\") " pod="openstack/barbican-db-sync-mvkm8" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.313140 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-ovsdbserver-sb\") pod \"dnsmasq-dns-ccd7c9f8f-bw8wq\" (UID: \"baee8d66-1152-499a-9e04-1c58353c4651\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.313182 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fd2f789-f994-429e-8eb0-2c37a0108808-logs\") pod \"glance-default-external-api-0\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " pod="openstack/glance-default-external-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.313746 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9fd2f789-f994-429e-8eb0-2c37a0108808-logs\") pod \"glance-default-external-api-0\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " pod="openstack/glance-default-external-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.314004 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9fd2f789-f994-429e-8eb0-2c37a0108808-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " pod="openstack/glance-default-external-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.316735 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fd2f789-f994-429e-8eb0-2c37a0108808-scripts\") pod \"glance-default-external-api-0\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " pod="openstack/glance-default-external-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.317383 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fd2f789-f994-429e-8eb0-2c37a0108808-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " pod="openstack/glance-default-external-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.318328 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/21a39679-80b0-4a80-ad64-fe3707c2a9f0-db-sync-config-data\") pod \"barbican-db-sync-mvkm8\" (UID: \"21a39679-80b0-4a80-ad64-fe3707c2a9f0\") " pod="openstack/barbican-db-sync-mvkm8" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.318511 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a39679-80b0-4a80-ad64-fe3707c2a9f0-combined-ca-bundle\") pod \"barbican-db-sync-mvkm8\" (UID: 
\"21a39679-80b0-4a80-ad64-fe3707c2a9f0\") " pod="openstack/barbican-db-sync-mvkm8" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.321967 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fd2f789-f994-429e-8eb0-2c37a0108808-config-data\") pod \"glance-default-external-api-0\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " pod="openstack/glance-default-external-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.322055 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fd2f789-f994-429e-8eb0-2c37a0108808-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " pod="openstack/glance-default-external-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.329252 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk7lz\" (UniqueName: \"kubernetes.io/projected/21a39679-80b0-4a80-ad64-fe3707c2a9f0-kube-api-access-fk7lz\") pod \"barbican-db-sync-mvkm8\" (UID: \"21a39679-80b0-4a80-ad64-fe3707c2a9f0\") " pod="openstack/barbican-db-sync-mvkm8" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.330759 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7552z\" (UniqueName: \"kubernetes.io/projected/9fd2f789-f994-429e-8eb0-2c37a0108808-kube-api-access-7552z\") pod \"glance-default-external-api-0\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " pod="openstack/glance-default-external-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.339730 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-c9j72" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.347114 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " pod="openstack/glance-default-external-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.360275 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-94db9c8bf-6qj27" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.386813 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-br7kf" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.401687 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mvkm8" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.415258 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-dns-svc\") pod \"dnsmasq-dns-ccd7c9f8f-bw8wq\" (UID: \"baee8d66-1152-499a-9e04-1c58353c4651\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.415316 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnpfp\" (UniqueName: \"kubernetes.io/projected/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-kube-api-access-rnpfp\") pod \"placement-db-sync-mcfl6\" (UID: \"ef1fa0a3-ab49-4807-a503-3a51a2b70e26\") " pod="openstack/placement-db-sync-mcfl6" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.415362 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-combined-ca-bundle\") pod 
\"placement-db-sync-mcfl6\" (UID: \"ef1fa0a3-ab49-4807-a503-3a51a2b70e26\") " pod="openstack/placement-db-sync-mcfl6" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.415379 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-logs\") pod \"placement-db-sync-mcfl6\" (UID: \"ef1fa0a3-ab49-4807-a503-3a51a2b70e26\") " pod="openstack/placement-db-sync-mcfl6" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.415415 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-ovsdbserver-nb\") pod \"dnsmasq-dns-ccd7c9f8f-bw8wq\" (UID: \"baee8d66-1152-499a-9e04-1c58353c4651\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.415437 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-dns-swift-storage-0\") pod \"dnsmasq-dns-ccd7c9f8f-bw8wq\" (UID: \"baee8d66-1152-499a-9e04-1c58353c4651\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.415473 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-scripts\") pod \"placement-db-sync-mcfl6\" (UID: \"ef1fa0a3-ab49-4807-a503-3a51a2b70e26\") " pod="openstack/placement-db-sync-mcfl6" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.415505 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-ovsdbserver-sb\") pod \"dnsmasq-dns-ccd7c9f8f-bw8wq\" (UID: \"baee8d66-1152-499a-9e04-1c58353c4651\") " 
pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.415530 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9f75\" (UniqueName: \"kubernetes.io/projected/baee8d66-1152-499a-9e04-1c58353c4651-kube-api-access-q9f75\") pod \"dnsmasq-dns-ccd7c9f8f-bw8wq\" (UID: \"baee8d66-1152-499a-9e04-1c58353c4651\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.415547 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-config-data\") pod \"placement-db-sync-mcfl6\" (UID: \"ef1fa0a3-ab49-4807-a503-3a51a2b70e26\") " pod="openstack/placement-db-sync-mcfl6" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.415569 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-config\") pod \"dnsmasq-dns-ccd7c9f8f-bw8wq\" (UID: \"baee8d66-1152-499a-9e04-1c58353c4651\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.416383 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-config\") pod \"dnsmasq-dns-ccd7c9f8f-bw8wq\" (UID: \"baee8d66-1152-499a-9e04-1c58353c4651\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.416738 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-ovsdbserver-sb\") pod \"dnsmasq-dns-ccd7c9f8f-bw8wq\" (UID: \"baee8d66-1152-499a-9e04-1c58353c4651\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.417209 4687 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-logs\") pod \"placement-db-sync-mcfl6\" (UID: \"ef1fa0a3-ab49-4807-a503-3a51a2b70e26\") " pod="openstack/placement-db-sync-mcfl6" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.417227 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-ovsdbserver-nb\") pod \"dnsmasq-dns-ccd7c9f8f-bw8wq\" (UID: \"baee8d66-1152-499a-9e04-1c58353c4651\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.417332 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-dns-svc\") pod \"dnsmasq-dns-ccd7c9f8f-bw8wq\" (UID: \"baee8d66-1152-499a-9e04-1c58353c4651\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.417452 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.417794 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-dns-swift-storage-0\") pod \"dnsmasq-dns-ccd7c9f8f-bw8wq\" (UID: \"baee8d66-1152-499a-9e04-1c58353c4651\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.420621 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-combined-ca-bundle\") pod \"placement-db-sync-mcfl6\" (UID: \"ef1fa0a3-ab49-4807-a503-3a51a2b70e26\") " pod="openstack/placement-db-sync-mcfl6" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.422265 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-scripts\") pod \"placement-db-sync-mcfl6\" (UID: \"ef1fa0a3-ab49-4807-a503-3a51a2b70e26\") " pod="openstack/placement-db-sync-mcfl6" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.424191 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-config-data\") pod \"placement-db-sync-mcfl6\" (UID: \"ef1fa0a3-ab49-4807-a503-3a51a2b70e26\") " pod="openstack/placement-db-sync-mcfl6" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.434485 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnpfp\" (UniqueName: \"kubernetes.io/projected/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-kube-api-access-rnpfp\") pod \"placement-db-sync-mcfl6\" (UID: \"ef1fa0a3-ab49-4807-a503-3a51a2b70e26\") " pod="openstack/placement-db-sync-mcfl6" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.434790 
4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9f75\" (UniqueName: \"kubernetes.io/projected/baee8d66-1152-499a-9e04-1c58353c4651-kube-api-access-q9f75\") pod \"dnsmasq-dns-ccd7c9f8f-bw8wq\" (UID: \"baee8d66-1152-499a-9e04-1c58353c4651\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.508812 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.531756 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-2lbtc"] Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.575057 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-29vzx"] Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.594558 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.608874 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6774d8fcc9-lpttg"] Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.734171 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mcfl6" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.797320 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.798964 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.801719 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.805557 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.805664 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.852312 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-c9j72"] Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.928992 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d086820b-63a2-481f-a349-1dad3879b659-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.929073 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.929134 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d086820b-63a2-481f-a349-1dad3879b659-logs\") pod \"glance-default-internal-api-0\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 
09:19:53.929173 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d086820b-63a2-481f-a349-1dad3879b659-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.929212 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d086820b-63a2-481f-a349-1dad3879b659-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.929227 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d086820b-63a2-481f-a349-1dad3879b659-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.929245 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72lv5\" (UniqueName: \"kubernetes.io/projected/d086820b-63a2-481f-a349-1dad3879b659-kube-api-access-72lv5\") pod \"glance-default-internal-api-0\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.929260 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d086820b-63a2-481f-a349-1dad3879b659-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") " pod="openstack/glance-default-internal-api-0" Feb 28 
09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.983327 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-br7kf"] Feb 28 09:19:53 crc kubenswrapper[4687]: I0228 09:19:53.994839 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-94db9c8bf-6qj27"] Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.004234 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-mvkm8"] Feb 28 09:19:54 crc kubenswrapper[4687]: W0228 09:19:54.015477 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27799696_4eb6_4ef9_9440_151a3929d699.slice/crio-9cb4ddb764f0c5a30b40d129e1c56024c04b6a19cb224b015cfc83c54194d2da WatchSource:0}: Error finding container 9cb4ddb764f0c5a30b40d129e1c56024c04b6a19cb224b015cfc83c54194d2da: Status 404 returned error can't find the container with id 9cb4ddb764f0c5a30b40d129e1c56024c04b6a19cb224b015cfc83c54194d2da Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.033609 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d086820b-63a2-481f-a349-1dad3879b659-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.033670 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d086820b-63a2-481f-a349-1dad3879b659-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.033691 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d086820b-63a2-481f-a349-1dad3879b659-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.033707 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72lv5\" (UniqueName: \"kubernetes.io/projected/d086820b-63a2-481f-a349-1dad3879b659-kube-api-access-72lv5\") pod \"glance-default-internal-api-0\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.033725 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d086820b-63a2-481f-a349-1dad3879b659-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.033766 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d086820b-63a2-481f-a349-1dad3879b659-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.033800 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.033847 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d086820b-63a2-481f-a349-1dad3879b659-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.034324 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d086820b-63a2-481f-a349-1dad3879b659-logs\") pod \"glance-default-internal-api-0\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.036003 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d086820b-63a2-481f-a349-1dad3879b659-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.037095 4687 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.040313 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d086820b-63a2-481f-a349-1dad3879b659-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.047975 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d086820b-63a2-481f-a349-1dad3879b659-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") 
" pod="openstack/glance-default-internal-api-0" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.050642 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d086820b-63a2-481f-a349-1dad3879b659-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.060492 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d086820b-63a2-481f-a349-1dad3879b659-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.060826 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72lv5\" (UniqueName: \"kubernetes.io/projected/d086820b-63a2-481f-a349-1dad3879b659-kube-api-access-72lv5\") pod \"glance-default-internal-api-0\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.078174 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.078573 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.160465 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-bw8wq"] Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.231695 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.249843 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-mcfl6"] Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.319057 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mvkm8" event={"ID":"21a39679-80b0-4a80-ad64-fe3707c2a9f0","Type":"ContainerStarted","Data":"f65eb8b258925b98bf73a7698d47c9bd18a95ae83349919a8e0172d797496532"} Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.326851 4687 generic.go:334] "Generic (PLEG): container finished" podID="588dbc79-5684-4058-8b12-0cad014a4cc4" containerID="b25b7540ef8c390a047f34b7d8ed71de7347d0dee32314138bb45e36ffc7dc6e" exitCode=0 Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.326917 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5985c59c55-2lbtc" event={"ID":"588dbc79-5684-4058-8b12-0cad014a4cc4","Type":"ContainerDied","Data":"b25b7540ef8c390a047f34b7d8ed71de7347d0dee32314138bb45e36ffc7dc6e"} Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.326935 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5985c59c55-2lbtc" event={"ID":"588dbc79-5684-4058-8b12-0cad014a4cc4","Type":"ContainerStarted","Data":"480fe509ef6d54afcf8333c265afe4ab4f7280a8eb575ac953678d694d5bc0e8"} Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.367064 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mcfl6" event={"ID":"ef1fa0a3-ab49-4807-a503-3a51a2b70e26","Type":"ContainerStarted","Data":"af5ef71ae62dd4647fb3bf15e5cca54837665fb3a538a4d8c05c340eaa099ec8"} Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.378319 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-c9j72" 
event={"ID":"3e5e221e-73c7-44a2-9af9-0feb60b412e0","Type":"ContainerStarted","Data":"2208c8fed1e88f5b5b0ba488bcffee0b225598f3f7537481f3ae92ba150a8d1d"} Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.381796 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-br7kf" event={"ID":"268be2d7-dd2e-42f0-b112-230de1abb1d4","Type":"ContainerStarted","Data":"ffaa6edabe26e3187e8637f65143ca2be45488d26ff58086f7d34291009d8496"} Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.381922 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-br7kf" event={"ID":"268be2d7-dd2e-42f0-b112-230de1abb1d4","Type":"ContainerStarted","Data":"6d4e7fa7da39fb9bd49b856d6ef041a63130a67af56f1a515d52b3a99f475ebf"} Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.392982 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6774d8fcc9-lpttg" event={"ID":"76f683cb-cc38-4cdd-a0f0-1077410b1768","Type":"ContainerStarted","Data":"e8d5b812f2edc197ed1fa8b0a0914b0152c058afe4646764eeeefcfa6ffe9e43"} Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.395707 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-29vzx" event={"ID":"565a264d-399a-47d9-8273-b8ca22fdc8b6","Type":"ContainerStarted","Data":"fe9ba56e9608511d072698b0a6f39183abd2a7895b689c86907310814b551612"} Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.395734 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-29vzx" event={"ID":"565a264d-399a-47d9-8273-b8ca22fdc8b6","Type":"ContainerStarted","Data":"114dd4a604c871beb8ae5ad7dfcf96747bc6b8498677fe685cddfe38832c7f78"} Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.409513 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-br7kf" podStartSLOduration=2.409492661 podStartE2EDuration="2.409492661s" podCreationTimestamp="2026-02-28 09:19:52 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:19:54.404649612 +0000 UTC m=+986.095218950" watchObservedRunningTime="2026-02-28 09:19:54.409492661 +0000 UTC m=+986.100061989" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.413183 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9fd2f789-f994-429e-8eb0-2c37a0108808","Type":"ContainerStarted","Data":"4aac96c1e6dd323e54dd8877ee53a408050fc2a154717e491f1f83d23d783ff4"} Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.417363 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-94db9c8bf-6qj27" event={"ID":"27799696-4eb6-4ef9-9440-151a3929d699","Type":"ContainerStarted","Data":"9cb4ddb764f0c5a30b40d129e1c56024c04b6a19cb224b015cfc83c54194d2da"} Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.422414 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" event={"ID":"baee8d66-1152-499a-9e04-1c58353c4651","Type":"ContainerStarted","Data":"02c68db94f383f8a0fd02f1ccdb13fd58afdf08441908981d131fe71a1cc72d7"} Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.429400 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a0893a8-0386-4d6d-9476-c061c3fb5f3d","Type":"ContainerStarted","Data":"b83edc249187f94706cb88fa7b442c63cc2c247afe76eefd355ca88641fe4c06"} Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.525991 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-29vzx" podStartSLOduration=2.52596468 podStartE2EDuration="2.52596468s" podCreationTimestamp="2026-02-28 09:19:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:19:54.429697895 +0000 UTC m=+986.120267242" 
watchObservedRunningTime="2026-02-28 09:19:54.52596468 +0000 UTC m=+986.216534017" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.707806 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-2lbtc" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.905592 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-dns-svc\") pod \"588dbc79-5684-4058-8b12-0cad014a4cc4\" (UID: \"588dbc79-5684-4058-8b12-0cad014a4cc4\") " Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.905886 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-ovsdbserver-sb\") pod \"588dbc79-5684-4058-8b12-0cad014a4cc4\" (UID: \"588dbc79-5684-4058-8b12-0cad014a4cc4\") " Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.905922 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-config\") pod \"588dbc79-5684-4058-8b12-0cad014a4cc4\" (UID: \"588dbc79-5684-4058-8b12-0cad014a4cc4\") " Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.906014 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-dns-swift-storage-0\") pod \"588dbc79-5684-4058-8b12-0cad014a4cc4\" (UID: \"588dbc79-5684-4058-8b12-0cad014a4cc4\") " Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.906145 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh2j9\" (UniqueName: \"kubernetes.io/projected/588dbc79-5684-4058-8b12-0cad014a4cc4-kube-api-access-xh2j9\") pod \"588dbc79-5684-4058-8b12-0cad014a4cc4\" (UID: 
\"588dbc79-5684-4058-8b12-0cad014a4cc4\") " Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.906172 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-ovsdbserver-nb\") pod \"588dbc79-5684-4058-8b12-0cad014a4cc4\" (UID: \"588dbc79-5684-4058-8b12-0cad014a4cc4\") " Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.913172 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/588dbc79-5684-4058-8b12-0cad014a4cc4-kube-api-access-xh2j9" (OuterVolumeSpecName: "kube-api-access-xh2j9") pod "588dbc79-5684-4058-8b12-0cad014a4cc4" (UID: "588dbc79-5684-4058-8b12-0cad014a4cc4"). InnerVolumeSpecName "kube-api-access-xh2j9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.930924 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "588dbc79-5684-4058-8b12-0cad014a4cc4" (UID: "588dbc79-5684-4058-8b12-0cad014a4cc4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.940785 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "588dbc79-5684-4058-8b12-0cad014a4cc4" (UID: "588dbc79-5684-4058-8b12-0cad014a4cc4"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.944746 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-config" (OuterVolumeSpecName: "config") pod "588dbc79-5684-4058-8b12-0cad014a4cc4" (UID: "588dbc79-5684-4058-8b12-0cad014a4cc4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.954063 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "588dbc79-5684-4058-8b12-0cad014a4cc4" (UID: "588dbc79-5684-4058-8b12-0cad014a4cc4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.954516 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "588dbc79-5684-4058-8b12-0cad014a4cc4" (UID: "588dbc79-5684-4058-8b12-0cad014a4cc4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:19:54 crc kubenswrapper[4687]: I0228 09:19:54.976471 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 28 09:19:54 crc kubenswrapper[4687]: W0228 09:19:54.987795 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd086820b_63a2_481f_a349_1dad3879b659.slice/crio-a9c3d266eda92ea5486216c3adf2811950472a32ce807a12bee1be0edb0657cb WatchSource:0}: Error finding container a9c3d266eda92ea5486216c3adf2811950472a32ce807a12bee1be0edb0657cb: Status 404 returned error can't find the container with id a9c3d266eda92ea5486216c3adf2811950472a32ce807a12bee1be0edb0657cb Feb 28 09:19:55 crc kubenswrapper[4687]: I0228 09:19:55.002098 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:19:55 crc kubenswrapper[4687]: I0228 09:19:55.002190 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:19:55 crc kubenswrapper[4687]: I0228 09:19:55.002258 4687 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" Feb 28 09:19:55 crc kubenswrapper[4687]: I0228 09:19:55.002994 4687 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f16534f65e44ed5dcb5a741301bfadba47516c592259f18b72f5912611ebb09f"} 
pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 28 09:19:55 crc kubenswrapper[4687]: I0228 09:19:55.003081 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" containerID="cri-o://f16534f65e44ed5dcb5a741301bfadba47516c592259f18b72f5912611ebb09f" gracePeriod=600 Feb 28 09:19:55 crc kubenswrapper[4687]: I0228 09:19:55.011478 4687 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:55 crc kubenswrapper[4687]: I0228 09:19:55.011526 4687 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:55 crc kubenswrapper[4687]: I0228 09:19:55.011539 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:55 crc kubenswrapper[4687]: I0228 09:19:55.011552 4687 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:55 crc kubenswrapper[4687]: I0228 09:19:55.011562 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh2j9\" (UniqueName: \"kubernetes.io/projected/588dbc79-5684-4058-8b12-0cad014a4cc4-kube-api-access-xh2j9\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:55 crc kubenswrapper[4687]: I0228 09:19:55.011573 4687 reconciler_common.go:293] "Volume detached 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/588dbc79-5684-4058-8b12-0cad014a4cc4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 28 09:19:55 crc kubenswrapper[4687]: I0228 09:19:55.451202 4687 generic.go:334] "Generic (PLEG): container finished" podID="baee8d66-1152-499a-9e04-1c58353c4651" containerID="cf77963f4f0d9b79eeebfb11e4dee3a877ce783225e125d32a8afd5304756876" exitCode=0 Feb 28 09:19:55 crc kubenswrapper[4687]: I0228 09:19:55.451401 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" event={"ID":"baee8d66-1152-499a-9e04-1c58353c4651","Type":"ContainerDied","Data":"cf77963f4f0d9b79eeebfb11e4dee3a877ce783225e125d32a8afd5304756876"} Feb 28 09:19:55 crc kubenswrapper[4687]: I0228 09:19:55.455970 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d086820b-63a2-481f-a349-1dad3879b659","Type":"ContainerStarted","Data":"a9c3d266eda92ea5486216c3adf2811950472a32ce807a12bee1be0edb0657cb"} Feb 28 09:19:55 crc kubenswrapper[4687]: I0228 09:19:55.466137 4687 generic.go:334] "Generic (PLEG): container finished" podID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerID="f16534f65e44ed5dcb5a741301bfadba47516c592259f18b72f5912611ebb09f" exitCode=0 Feb 28 09:19:55 crc kubenswrapper[4687]: I0228 09:19:55.466207 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerDied","Data":"f16534f65e44ed5dcb5a741301bfadba47516c592259f18b72f5912611ebb09f"} Feb 28 09:19:55 crc kubenswrapper[4687]: I0228 09:19:55.466235 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerStarted","Data":"70e6449ca6d918497ca91c82bcac17a1011e8ea5698b1bdf893e712bee9903d3"} Feb 28 09:19:55 crc 
kubenswrapper[4687]: I0228 09:19:55.466254 4687 scope.go:117] "RemoveContainer" containerID="e2099836a5e3e90d046dbb8521988fee6933b3b356479c2ff7510ccbe5caaedf" Feb 28 09:19:55 crc kubenswrapper[4687]: I0228 09:19:55.479117 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5985c59c55-2lbtc" event={"ID":"588dbc79-5684-4058-8b12-0cad014a4cc4","Type":"ContainerDied","Data":"480fe509ef6d54afcf8333c265afe4ab4f7280a8eb575ac953678d694d5bc0e8"} Feb 28 09:19:55 crc kubenswrapper[4687]: I0228 09:19:55.479214 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-2lbtc" Feb 28 09:19:55 crc kubenswrapper[4687]: I0228 09:19:55.488004 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9fd2f789-f994-429e-8eb0-2c37a0108808","Type":"ContainerStarted","Data":"00100e0fbea373e59576c62f6e38c68d85a8282e212320e6cae828594dd164cc"} Feb 28 09:19:55 crc kubenswrapper[4687]: I0228 09:19:55.553077 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-2lbtc"] Feb 28 09:19:55 crc kubenswrapper[4687]: I0228 09:19:55.559421 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-2lbtc"] Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.034245 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.064871 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6774d8fcc9-lpttg"] Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.085457 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-9b795df4f-65xfj"] Feb 28 09:19:56 crc kubenswrapper[4687]: E0228 09:19:56.085873 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="588dbc79-5684-4058-8b12-0cad014a4cc4" containerName="init" Feb 28 09:19:56 crc 
kubenswrapper[4687]: I0228 09:19:56.085893 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="588dbc79-5684-4058-8b12-0cad014a4cc4" containerName="init" Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.089218 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="588dbc79-5684-4058-8b12-0cad014a4cc4" containerName="init" Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.090124 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9b795df4f-65xfj" Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.096817 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.117781 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9b795df4f-65xfj"] Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.141606 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84c40408-c638-4bea-86d5-fb40a60b6975-logs\") pod \"horizon-9b795df4f-65xfj\" (UID: \"84c40408-c638-4bea-86d5-fb40a60b6975\") " pod="openstack/horizon-9b795df4f-65xfj" Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.141722 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/84c40408-c638-4bea-86d5-fb40a60b6975-horizon-secret-key\") pod \"horizon-9b795df4f-65xfj\" (UID: \"84c40408-c638-4bea-86d5-fb40a60b6975\") " pod="openstack/horizon-9b795df4f-65xfj" Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.141753 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84c40408-c638-4bea-86d5-fb40a60b6975-scripts\") pod \"horizon-9b795df4f-65xfj\" (UID: \"84c40408-c638-4bea-86d5-fb40a60b6975\") " 
pod="openstack/horizon-9b795df4f-65xfj" Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.141771 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84c40408-c638-4bea-86d5-fb40a60b6975-config-data\") pod \"horizon-9b795df4f-65xfj\" (UID: \"84c40408-c638-4bea-86d5-fb40a60b6975\") " pod="openstack/horizon-9b795df4f-65xfj" Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.141800 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4jsm\" (UniqueName: \"kubernetes.io/projected/84c40408-c638-4bea-86d5-fb40a60b6975-kube-api-access-h4jsm\") pod \"horizon-9b795df4f-65xfj\" (UID: \"84c40408-c638-4bea-86d5-fb40a60b6975\") " pod="openstack/horizon-9b795df4f-65xfj" Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.168162 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.196797 4687 scope.go:117] "RemoveContainer" containerID="b25b7540ef8c390a047f34b7d8ed71de7347d0dee32314138bb45e36ffc7dc6e" Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.242324 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/84c40408-c638-4bea-86d5-fb40a60b6975-horizon-secret-key\") pod \"horizon-9b795df4f-65xfj\" (UID: \"84c40408-c638-4bea-86d5-fb40a60b6975\") " pod="openstack/horizon-9b795df4f-65xfj" Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.242374 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84c40408-c638-4bea-86d5-fb40a60b6975-scripts\") pod \"horizon-9b795df4f-65xfj\" (UID: \"84c40408-c638-4bea-86d5-fb40a60b6975\") " pod="openstack/horizon-9b795df4f-65xfj" Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.242399 
4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84c40408-c638-4bea-86d5-fb40a60b6975-config-data\") pod \"horizon-9b795df4f-65xfj\" (UID: \"84c40408-c638-4bea-86d5-fb40a60b6975\") " pod="openstack/horizon-9b795df4f-65xfj" Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.242431 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4jsm\" (UniqueName: \"kubernetes.io/projected/84c40408-c638-4bea-86d5-fb40a60b6975-kube-api-access-h4jsm\") pod \"horizon-9b795df4f-65xfj\" (UID: \"84c40408-c638-4bea-86d5-fb40a60b6975\") " pod="openstack/horizon-9b795df4f-65xfj" Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.242463 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84c40408-c638-4bea-86d5-fb40a60b6975-logs\") pod \"horizon-9b795df4f-65xfj\" (UID: \"84c40408-c638-4bea-86d5-fb40a60b6975\") " pod="openstack/horizon-9b795df4f-65xfj" Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.242879 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84c40408-c638-4bea-86d5-fb40a60b6975-logs\") pod \"horizon-9b795df4f-65xfj\" (UID: \"84c40408-c638-4bea-86d5-fb40a60b6975\") " pod="openstack/horizon-9b795df4f-65xfj" Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.243438 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84c40408-c638-4bea-86d5-fb40a60b6975-scripts\") pod \"horizon-9b795df4f-65xfj\" (UID: \"84c40408-c638-4bea-86d5-fb40a60b6975\") " pod="openstack/horizon-9b795df4f-65xfj" Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.244389 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/84c40408-c638-4bea-86d5-fb40a60b6975-config-data\") pod \"horizon-9b795df4f-65xfj\" (UID: \"84c40408-c638-4bea-86d5-fb40a60b6975\") " pod="openstack/horizon-9b795df4f-65xfj" Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.248428 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/84c40408-c638-4bea-86d5-fb40a60b6975-horizon-secret-key\") pod \"horizon-9b795df4f-65xfj\" (UID: \"84c40408-c638-4bea-86d5-fb40a60b6975\") " pod="openstack/horizon-9b795df4f-65xfj" Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.269526 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4jsm\" (UniqueName: \"kubernetes.io/projected/84c40408-c638-4bea-86d5-fb40a60b6975-kube-api-access-h4jsm\") pod \"horizon-9b795df4f-65xfj\" (UID: \"84c40408-c638-4bea-86d5-fb40a60b6975\") " pod="openstack/horizon-9b795df4f-65xfj" Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.408729 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-9b795df4f-65xfj" Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.524340 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9fd2f789-f994-429e-8eb0-2c37a0108808","Type":"ContainerStarted","Data":"2181b6c1937d7e3ed1a60397e6a7da406766826b8be513e8af25179005bec7ff"} Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.524539 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9fd2f789-f994-429e-8eb0-2c37a0108808" containerName="glance-log" containerID="cri-o://00100e0fbea373e59576c62f6e38c68d85a8282e212320e6cae828594dd164cc" gracePeriod=30 Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.525433 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9fd2f789-f994-429e-8eb0-2c37a0108808" containerName="glance-httpd" containerID="cri-o://2181b6c1937d7e3ed1a60397e6a7da406766826b8be513e8af25179005bec7ff" gracePeriod=30 Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.531508 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" event={"ID":"baee8d66-1152-499a-9e04-1c58353c4651","Type":"ContainerStarted","Data":"505a526a74270e82d4537e98fc87928deb20bfb6ac074cbc2be0f77b8932a155"} Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.531612 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.533481 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d086820b-63a2-481f-a349-1dad3879b659","Type":"ContainerStarted","Data":"9b75cdf75a917e7e4d9797cc55dbe0772ac792d304b57a4b110c886a675867d0"} Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.550380 4687 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.550363848 podStartE2EDuration="4.550363848s" podCreationTimestamp="2026-02-28 09:19:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:19:56.539080305 +0000 UTC m=+988.229649642" watchObservedRunningTime="2026-02-28 09:19:56.550363848 +0000 UTC m=+988.240933185" Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.564555 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" podStartSLOduration=3.564515165 podStartE2EDuration="3.564515165s" podCreationTimestamp="2026-02-28 09:19:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:19:56.564325668 +0000 UTC m=+988.254895004" watchObservedRunningTime="2026-02-28 09:19:56.564515165 +0000 UTC m=+988.255084502" Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.671220 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="588dbc79-5684-4058-8b12-0cad014a4cc4" path="/var/lib/kubelet/pods/588dbc79-5684-4058-8b12-0cad014a4cc4/volumes" Feb 28 09:19:56 crc kubenswrapper[4687]: I0228 09:19:56.975424 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9b795df4f-65xfj"] Feb 28 09:19:56 crc kubenswrapper[4687]: W0228 09:19:56.984217 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84c40408_c638_4bea_86d5_fb40a60b6975.slice/crio-0a0a86b425d00964404e37a811d5b05c915d79a2bfeb451d659a2a38fec5dd2f WatchSource:0}: Error finding container 0a0a86b425d00964404e37a811d5b05c915d79a2bfeb451d659a2a38fec5dd2f: Status 404 returned error can't find the container with id 
0a0a86b425d00964404e37a811d5b05c915d79a2bfeb451d659a2a38fec5dd2f Feb 28 09:19:57 crc kubenswrapper[4687]: I0228 09:19:57.551293 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9b795df4f-65xfj" event={"ID":"84c40408-c638-4bea-86d5-fb40a60b6975","Type":"ContainerStarted","Data":"0a0a86b425d00964404e37a811d5b05c915d79a2bfeb451d659a2a38fec5dd2f"} Feb 28 09:19:57 crc kubenswrapper[4687]: I0228 09:19:57.559961 4687 generic.go:334] "Generic (PLEG): container finished" podID="9fd2f789-f994-429e-8eb0-2c37a0108808" containerID="2181b6c1937d7e3ed1a60397e6a7da406766826b8be513e8af25179005bec7ff" exitCode=0 Feb 28 09:19:57 crc kubenswrapper[4687]: I0228 09:19:57.559998 4687 generic.go:334] "Generic (PLEG): container finished" podID="9fd2f789-f994-429e-8eb0-2c37a0108808" containerID="00100e0fbea373e59576c62f6e38c68d85a8282e212320e6cae828594dd164cc" exitCode=143 Feb 28 09:19:57 crc kubenswrapper[4687]: I0228 09:19:57.560038 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9fd2f789-f994-429e-8eb0-2c37a0108808","Type":"ContainerDied","Data":"2181b6c1937d7e3ed1a60397e6a7da406766826b8be513e8af25179005bec7ff"} Feb 28 09:19:57 crc kubenswrapper[4687]: I0228 09:19:57.560091 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9fd2f789-f994-429e-8eb0-2c37a0108808","Type":"ContainerDied","Data":"00100e0fbea373e59576c62f6e38c68d85a8282e212320e6cae828594dd164cc"} Feb 28 09:19:57 crc kubenswrapper[4687]: I0228 09:19:57.562473 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d086820b-63a2-481f-a349-1dad3879b659","Type":"ContainerStarted","Data":"778b96cc88ff91771173ff240f0837a50b20e9098afa8f37822ee48240c110c3"} Feb 28 09:19:57 crc kubenswrapper[4687]: I0228 09:19:57.562601 4687 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-internal-api-0" podUID="d086820b-63a2-481f-a349-1dad3879b659" containerName="glance-log" containerID="cri-o://9b75cdf75a917e7e4d9797cc55dbe0772ac792d304b57a4b110c886a675867d0" gracePeriod=30 Feb 28 09:19:57 crc kubenswrapper[4687]: I0228 09:19:57.563209 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d086820b-63a2-481f-a349-1dad3879b659" containerName="glance-httpd" containerID="cri-o://778b96cc88ff91771173ff240f0837a50b20e9098afa8f37822ee48240c110c3" gracePeriod=30 Feb 28 09:19:57 crc kubenswrapper[4687]: I0228 09:19:57.584057 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.584012053 podStartE2EDuration="5.584012053s" podCreationTimestamp="2026-02-28 09:19:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:19:57.583221026 +0000 UTC m=+989.273790383" watchObservedRunningTime="2026-02-28 09:19:57.584012053 +0000 UTC m=+989.274581391" Feb 28 09:19:58 crc kubenswrapper[4687]: I0228 09:19:58.582969 4687 generic.go:334] "Generic (PLEG): container finished" podID="d086820b-63a2-481f-a349-1dad3879b659" containerID="778b96cc88ff91771173ff240f0837a50b20e9098afa8f37822ee48240c110c3" exitCode=0 Feb 28 09:19:58 crc kubenswrapper[4687]: I0228 09:19:58.583501 4687 generic.go:334] "Generic (PLEG): container finished" podID="d086820b-63a2-481f-a349-1dad3879b659" containerID="9b75cdf75a917e7e4d9797cc55dbe0772ac792d304b57a4b110c886a675867d0" exitCode=143 Feb 28 09:19:58 crc kubenswrapper[4687]: I0228 09:19:58.583550 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d086820b-63a2-481f-a349-1dad3879b659","Type":"ContainerDied","Data":"778b96cc88ff91771173ff240f0837a50b20e9098afa8f37822ee48240c110c3"} Feb 28 09:19:58 crc 
kubenswrapper[4687]: I0228 09:19:58.583580 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d086820b-63a2-481f-a349-1dad3879b659","Type":"ContainerDied","Data":"9b75cdf75a917e7e4d9797cc55dbe0772ac792d304b57a4b110c886a675867d0"} Feb 28 09:19:58 crc kubenswrapper[4687]: I0228 09:19:58.585825 4687 generic.go:334] "Generic (PLEG): container finished" podID="565a264d-399a-47d9-8273-b8ca22fdc8b6" containerID="fe9ba56e9608511d072698b0a6f39183abd2a7895b689c86907310814b551612" exitCode=0 Feb 28 09:19:58 crc kubenswrapper[4687]: I0228 09:19:58.585870 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-29vzx" event={"ID":"565a264d-399a-47d9-8273-b8ca22fdc8b6","Type":"ContainerDied","Data":"fe9ba56e9608511d072698b0a6f39183abd2a7895b689c86907310814b551612"} Feb 28 09:20:00 crc kubenswrapper[4687]: I0228 09:20:00.154400 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537840-2wf4q"] Feb 28 09:20:00 crc kubenswrapper[4687]: I0228 09:20:00.156093 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537840-2wf4q" Feb 28 09:20:00 crc kubenswrapper[4687]: I0228 09:20:00.158218 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:20:00 crc kubenswrapper[4687]: I0228 09:20:00.158232 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:20:00 crc kubenswrapper[4687]: I0228 09:20:00.158312 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:20:00 crc kubenswrapper[4687]: I0228 09:20:00.166155 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537840-2wf4q"] Feb 28 09:20:00 crc kubenswrapper[4687]: I0228 09:20:00.335874 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl42m\" (UniqueName: \"kubernetes.io/projected/694d7626-7d52-4f55-a8c3-79feaec0e5e2-kube-api-access-fl42m\") pod \"auto-csr-approver-29537840-2wf4q\" (UID: \"694d7626-7d52-4f55-a8c3-79feaec0e5e2\") " pod="openshift-infra/auto-csr-approver-29537840-2wf4q" Feb 28 09:20:00 crc kubenswrapper[4687]: I0228 09:20:00.436873 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl42m\" (UniqueName: \"kubernetes.io/projected/694d7626-7d52-4f55-a8c3-79feaec0e5e2-kube-api-access-fl42m\") pod \"auto-csr-approver-29537840-2wf4q\" (UID: \"694d7626-7d52-4f55-a8c3-79feaec0e5e2\") " pod="openshift-infra/auto-csr-approver-29537840-2wf4q" Feb 28 09:20:00 crc kubenswrapper[4687]: I0228 09:20:00.466485 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl42m\" (UniqueName: \"kubernetes.io/projected/694d7626-7d52-4f55-a8c3-79feaec0e5e2-kube-api-access-fl42m\") pod \"auto-csr-approver-29537840-2wf4q\" (UID: \"694d7626-7d52-4f55-a8c3-79feaec0e5e2\") " 
pod="openshift-infra/auto-csr-approver-29537840-2wf4q" Feb 28 09:20:00 crc kubenswrapper[4687]: I0228 09:20:00.483392 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537840-2wf4q" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.615925 4687 generic.go:334] "Generic (PLEG): container finished" podID="268be2d7-dd2e-42f0-b112-230de1abb1d4" containerID="ffaa6edabe26e3187e8637f65143ca2be45488d26ff58086f7d34291009d8496" exitCode=0 Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.616007 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-br7kf" event={"ID":"268be2d7-dd2e-42f0-b112-230de1abb1d4","Type":"ContainerDied","Data":"ffaa6edabe26e3187e8637f65143ca2be45488d26ff58086f7d34291009d8496"} Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.706128 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-94db9c8bf-6qj27"] Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.740931 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5d58956cb6-f8plp"] Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.743163 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.748498 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.758800 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d58956cb6-f8plp"] Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.821126 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9b795df4f-65xfj"] Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.861589 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-b9587f844-jq5pd"] Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.864840 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b9587f844-jq5pd" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.872895 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b9587f844-jq5pd"] Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.875578 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a06887c-91c5-43bb-8631-53fac29e79b6-config-data\") pod \"horizon-5d58956cb6-f8plp\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.875614 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a06887c-91c5-43bb-8631-53fac29e79b6-horizon-tls-certs\") pod \"horizon-5d58956cb6-f8plp\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.875668 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a06887c-91c5-43bb-8631-53fac29e79b6-logs\") pod \"horizon-5d58956cb6-f8plp\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.875713 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n24bp\" (UniqueName: \"kubernetes.io/projected/6a06887c-91c5-43bb-8631-53fac29e79b6-kube-api-access-n24bp\") pod \"horizon-5d58956cb6-f8plp\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.875737 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6a06887c-91c5-43bb-8631-53fac29e79b6-horizon-secret-key\") pod \"horizon-5d58956cb6-f8plp\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.875800 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a06887c-91c5-43bb-8631-53fac29e79b6-scripts\") pod \"horizon-5d58956cb6-f8plp\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.875841 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a06887c-91c5-43bb-8631-53fac29e79b6-combined-ca-bundle\") pod \"horizon-5d58956cb6-f8plp\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.979577 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/6a06887c-91c5-43bb-8631-53fac29e79b6-logs\") pod \"horizon-5d58956cb6-f8plp\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.979626 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cjqs\" (UniqueName: \"kubernetes.io/projected/113841cd-f813-4ee0-93cf-2e3cfb43f6fc-kube-api-access-8cjqs\") pod \"horizon-b9587f844-jq5pd\" (UID: \"113841cd-f813-4ee0-93cf-2e3cfb43f6fc\") " pod="openstack/horizon-b9587f844-jq5pd" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.979720 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n24bp\" (UniqueName: \"kubernetes.io/projected/6a06887c-91c5-43bb-8631-53fac29e79b6-kube-api-access-n24bp\") pod \"horizon-5d58956cb6-f8plp\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.979752 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/113841cd-f813-4ee0-93cf-2e3cfb43f6fc-horizon-tls-certs\") pod \"horizon-b9587f844-jq5pd\" (UID: \"113841cd-f813-4ee0-93cf-2e3cfb43f6fc\") " pod="openstack/horizon-b9587f844-jq5pd" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.979779 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6a06887c-91c5-43bb-8631-53fac29e79b6-horizon-secret-key\") pod \"horizon-5d58956cb6-f8plp\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.979810 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/113841cd-f813-4ee0-93cf-2e3cfb43f6fc-scripts\") pod \"horizon-b9587f844-jq5pd\" (UID: \"113841cd-f813-4ee0-93cf-2e3cfb43f6fc\") " pod="openstack/horizon-b9587f844-jq5pd" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.980211 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a06887c-91c5-43bb-8631-53fac29e79b6-logs\") pod \"horizon-5d58956cb6-f8plp\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.982794 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a06887c-91c5-43bb-8631-53fac29e79b6-scripts\") pod \"horizon-5d58956cb6-f8plp\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.982842 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/113841cd-f813-4ee0-93cf-2e3cfb43f6fc-logs\") pod \"horizon-b9587f844-jq5pd\" (UID: \"113841cd-f813-4ee0-93cf-2e3cfb43f6fc\") " pod="openstack/horizon-b9587f844-jq5pd" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.982864 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/113841cd-f813-4ee0-93cf-2e3cfb43f6fc-horizon-secret-key\") pod \"horizon-b9587f844-jq5pd\" (UID: \"113841cd-f813-4ee0-93cf-2e3cfb43f6fc\") " pod="openstack/horizon-b9587f844-jq5pd" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.982960 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113841cd-f813-4ee0-93cf-2e3cfb43f6fc-combined-ca-bundle\") pod 
\"horizon-b9587f844-jq5pd\" (UID: \"113841cd-f813-4ee0-93cf-2e3cfb43f6fc\") " pod="openstack/horizon-b9587f844-jq5pd" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.982987 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a06887c-91c5-43bb-8631-53fac29e79b6-combined-ca-bundle\") pod \"horizon-5d58956cb6-f8plp\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.983042 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/113841cd-f813-4ee0-93cf-2e3cfb43f6fc-config-data\") pod \"horizon-b9587f844-jq5pd\" (UID: \"113841cd-f813-4ee0-93cf-2e3cfb43f6fc\") " pod="openstack/horizon-b9587f844-jq5pd" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.983095 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a06887c-91c5-43bb-8631-53fac29e79b6-config-data\") pod \"horizon-5d58956cb6-f8plp\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.983112 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a06887c-91c5-43bb-8631-53fac29e79b6-horizon-tls-certs\") pod \"horizon-5d58956cb6-f8plp\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.983607 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a06887c-91c5-43bb-8631-53fac29e79b6-scripts\") pod \"horizon-5d58956cb6-f8plp\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " 
pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.984247 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a06887c-91c5-43bb-8631-53fac29e79b6-config-data\") pod \"horizon-5d58956cb6-f8plp\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.985996 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6a06887c-91c5-43bb-8631-53fac29e79b6-horizon-secret-key\") pod \"horizon-5d58956cb6-f8plp\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.986643 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a06887c-91c5-43bb-8631-53fac29e79b6-horizon-tls-certs\") pod \"horizon-5d58956cb6-f8plp\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:20:01 crc kubenswrapper[4687]: I0228 09:20:01.992835 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n24bp\" (UniqueName: \"kubernetes.io/projected/6a06887c-91c5-43bb-8631-53fac29e79b6-kube-api-access-n24bp\") pod \"horizon-5d58956cb6-f8plp\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:20:02 crc kubenswrapper[4687]: I0228 09:20:02.005431 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a06887c-91c5-43bb-8631-53fac29e79b6-combined-ca-bundle\") pod \"horizon-5d58956cb6-f8plp\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:20:02 crc kubenswrapper[4687]: I0228 09:20:02.076192 4687 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:20:02 crc kubenswrapper[4687]: I0228 09:20:02.084776 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cjqs\" (UniqueName: \"kubernetes.io/projected/113841cd-f813-4ee0-93cf-2e3cfb43f6fc-kube-api-access-8cjqs\") pod \"horizon-b9587f844-jq5pd\" (UID: \"113841cd-f813-4ee0-93cf-2e3cfb43f6fc\") " pod="openstack/horizon-b9587f844-jq5pd" Feb 28 09:20:02 crc kubenswrapper[4687]: I0228 09:20:02.085222 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/113841cd-f813-4ee0-93cf-2e3cfb43f6fc-horizon-tls-certs\") pod \"horizon-b9587f844-jq5pd\" (UID: \"113841cd-f813-4ee0-93cf-2e3cfb43f6fc\") " pod="openstack/horizon-b9587f844-jq5pd" Feb 28 09:20:02 crc kubenswrapper[4687]: I0228 09:20:02.085285 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/113841cd-f813-4ee0-93cf-2e3cfb43f6fc-scripts\") pod \"horizon-b9587f844-jq5pd\" (UID: \"113841cd-f813-4ee0-93cf-2e3cfb43f6fc\") " pod="openstack/horizon-b9587f844-jq5pd" Feb 28 09:20:02 crc kubenswrapper[4687]: I0228 09:20:02.085461 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/113841cd-f813-4ee0-93cf-2e3cfb43f6fc-logs\") pod \"horizon-b9587f844-jq5pd\" (UID: \"113841cd-f813-4ee0-93cf-2e3cfb43f6fc\") " pod="openstack/horizon-b9587f844-jq5pd" Feb 28 09:20:02 crc kubenswrapper[4687]: I0228 09:20:02.085492 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/113841cd-f813-4ee0-93cf-2e3cfb43f6fc-horizon-secret-key\") pod \"horizon-b9587f844-jq5pd\" (UID: \"113841cd-f813-4ee0-93cf-2e3cfb43f6fc\") " pod="openstack/horizon-b9587f844-jq5pd" Feb 
28 09:20:02 crc kubenswrapper[4687]: I0228 09:20:02.085570 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113841cd-f813-4ee0-93cf-2e3cfb43f6fc-combined-ca-bundle\") pod \"horizon-b9587f844-jq5pd\" (UID: \"113841cd-f813-4ee0-93cf-2e3cfb43f6fc\") " pod="openstack/horizon-b9587f844-jq5pd" Feb 28 09:20:02 crc kubenswrapper[4687]: I0228 09:20:02.085619 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/113841cd-f813-4ee0-93cf-2e3cfb43f6fc-config-data\") pod \"horizon-b9587f844-jq5pd\" (UID: \"113841cd-f813-4ee0-93cf-2e3cfb43f6fc\") " pod="openstack/horizon-b9587f844-jq5pd" Feb 28 09:20:02 crc kubenswrapper[4687]: I0228 09:20:02.085907 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/113841cd-f813-4ee0-93cf-2e3cfb43f6fc-logs\") pod \"horizon-b9587f844-jq5pd\" (UID: \"113841cd-f813-4ee0-93cf-2e3cfb43f6fc\") " pod="openstack/horizon-b9587f844-jq5pd" Feb 28 09:20:02 crc kubenswrapper[4687]: I0228 09:20:02.086012 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/113841cd-f813-4ee0-93cf-2e3cfb43f6fc-scripts\") pod \"horizon-b9587f844-jq5pd\" (UID: \"113841cd-f813-4ee0-93cf-2e3cfb43f6fc\") " pod="openstack/horizon-b9587f844-jq5pd" Feb 28 09:20:02 crc kubenswrapper[4687]: I0228 09:20:02.088566 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/113841cd-f813-4ee0-93cf-2e3cfb43f6fc-config-data\") pod \"horizon-b9587f844-jq5pd\" (UID: \"113841cd-f813-4ee0-93cf-2e3cfb43f6fc\") " pod="openstack/horizon-b9587f844-jq5pd" Feb 28 09:20:02 crc kubenswrapper[4687]: I0228 09:20:02.089375 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/113841cd-f813-4ee0-93cf-2e3cfb43f6fc-horizon-tls-certs\") pod \"horizon-b9587f844-jq5pd\" (UID: \"113841cd-f813-4ee0-93cf-2e3cfb43f6fc\") " pod="openstack/horizon-b9587f844-jq5pd" Feb 28 09:20:02 crc kubenswrapper[4687]: I0228 09:20:02.090250 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113841cd-f813-4ee0-93cf-2e3cfb43f6fc-combined-ca-bundle\") pod \"horizon-b9587f844-jq5pd\" (UID: \"113841cd-f813-4ee0-93cf-2e3cfb43f6fc\") " pod="openstack/horizon-b9587f844-jq5pd" Feb 28 09:20:02 crc kubenswrapper[4687]: I0228 09:20:02.090371 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/113841cd-f813-4ee0-93cf-2e3cfb43f6fc-horizon-secret-key\") pod \"horizon-b9587f844-jq5pd\" (UID: \"113841cd-f813-4ee0-93cf-2e3cfb43f6fc\") " pod="openstack/horizon-b9587f844-jq5pd" Feb 28 09:20:02 crc kubenswrapper[4687]: I0228 09:20:02.100080 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cjqs\" (UniqueName: \"kubernetes.io/projected/113841cd-f813-4ee0-93cf-2e3cfb43f6fc-kube-api-access-8cjqs\") pod \"horizon-b9587f844-jq5pd\" (UID: \"113841cd-f813-4ee0-93cf-2e3cfb43f6fc\") " pod="openstack/horizon-b9587f844-jq5pd" Feb 28 09:20:02 crc kubenswrapper[4687]: I0228 09:20:02.184156 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b9587f844-jq5pd" Feb 28 09:20:03 crc kubenswrapper[4687]: I0228 09:20:03.512293 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" Feb 28 09:20:03 crc kubenswrapper[4687]: I0228 09:20:03.565593 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-2zs2p"] Feb 28 09:20:03 crc kubenswrapper[4687]: I0228 09:20:03.566062 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" podUID="878defc9-19d4-48ce-92c3-9b0976de28d2" containerName="dnsmasq-dns" containerID="cri-o://c71ddd519cc0345f9d2e74444dbba50f32616af11432deeaaec79043832ee2de" gracePeriod=10 Feb 28 09:20:04 crc kubenswrapper[4687]: I0228 09:20:04.652471 4687 generic.go:334] "Generic (PLEG): container finished" podID="878defc9-19d4-48ce-92c3-9b0976de28d2" containerID="c71ddd519cc0345f9d2e74444dbba50f32616af11432deeaaec79043832ee2de" exitCode=0 Feb 28 09:20:04 crc kubenswrapper[4687]: I0228 09:20:04.652632 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" event={"ID":"878defc9-19d4-48ce-92c3-9b0976de28d2","Type":"ContainerDied","Data":"c71ddd519cc0345f9d2e74444dbba50f32616af11432deeaaec79043832ee2de"} Feb 28 09:20:07 crc kubenswrapper[4687]: I0228 09:20:07.699811 4687 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" podUID="878defc9-19d4-48ce-92c3-9b0976de28d2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused" Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.384948 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.533599 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fd2f789-f994-429e-8eb0-2c37a0108808-scripts\") pod \"9fd2f789-f994-429e-8eb0-2c37a0108808\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.533669 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9fd2f789-f994-429e-8eb0-2c37a0108808-httpd-run\") pod \"9fd2f789-f994-429e-8eb0-2c37a0108808\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.533816 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fd2f789-f994-429e-8eb0-2c37a0108808-combined-ca-bundle\") pod \"9fd2f789-f994-429e-8eb0-2c37a0108808\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.533860 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fd2f789-f994-429e-8eb0-2c37a0108808-logs\") pod \"9fd2f789-f994-429e-8eb0-2c37a0108808\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.533928 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7552z\" (UniqueName: \"kubernetes.io/projected/9fd2f789-f994-429e-8eb0-2c37a0108808-kube-api-access-7552z\") pod \"9fd2f789-f994-429e-8eb0-2c37a0108808\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.533958 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"9fd2f789-f994-429e-8eb0-2c37a0108808\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.534001 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fd2f789-f994-429e-8eb0-2c37a0108808-config-data\") pod \"9fd2f789-f994-429e-8eb0-2c37a0108808\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.534186 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fd2f789-f994-429e-8eb0-2c37a0108808-public-tls-certs\") pod \"9fd2f789-f994-429e-8eb0-2c37a0108808\" (UID: \"9fd2f789-f994-429e-8eb0-2c37a0108808\") " Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.534431 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fd2f789-f994-429e-8eb0-2c37a0108808-logs" (OuterVolumeSpecName: "logs") pod "9fd2f789-f994-429e-8eb0-2c37a0108808" (UID: "9fd2f789-f994-429e-8eb0-2c37a0108808"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.534419 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fd2f789-f994-429e-8eb0-2c37a0108808-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9fd2f789-f994-429e-8eb0-2c37a0108808" (UID: "9fd2f789-f994-429e-8eb0-2c37a0108808"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.537672 4687 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9fd2f789-f994-429e-8eb0-2c37a0108808-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.537712 4687 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fd2f789-f994-429e-8eb0-2c37a0108808-logs\") on node \"crc\" DevicePath \"\""
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.540875 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "9fd2f789-f994-429e-8eb0-2c37a0108808" (UID: "9fd2f789-f994-429e-8eb0-2c37a0108808"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.542050 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fd2f789-f994-429e-8eb0-2c37a0108808-scripts" (OuterVolumeSpecName: "scripts") pod "9fd2f789-f994-429e-8eb0-2c37a0108808" (UID: "9fd2f789-f994-429e-8eb0-2c37a0108808"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.545836 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fd2f789-f994-429e-8eb0-2c37a0108808-kube-api-access-7552z" (OuterVolumeSpecName: "kube-api-access-7552z") pod "9fd2f789-f994-429e-8eb0-2c37a0108808" (UID: "9fd2f789-f994-429e-8eb0-2c37a0108808"). InnerVolumeSpecName "kube-api-access-7552z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.567068 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fd2f789-f994-429e-8eb0-2c37a0108808-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9fd2f789-f994-429e-8eb0-2c37a0108808" (UID: "9fd2f789-f994-429e-8eb0-2c37a0108808"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.594558 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fd2f789-f994-429e-8eb0-2c37a0108808-config-data" (OuterVolumeSpecName: "config-data") pod "9fd2f789-f994-429e-8eb0-2c37a0108808" (UID: "9fd2f789-f994-429e-8eb0-2c37a0108808"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.605504 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fd2f789-f994-429e-8eb0-2c37a0108808-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9fd2f789-f994-429e-8eb0-2c37a0108808" (UID: "9fd2f789-f994-429e-8eb0-2c37a0108808"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.640710 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fd2f789-f994-429e-8eb0-2c37a0108808-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.640744 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7552z\" (UniqueName: \"kubernetes.io/projected/9fd2f789-f994-429e-8eb0-2c37a0108808-kube-api-access-7552z\") on node \"crc\" DevicePath \"\""
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.640787 4687 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.640800 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fd2f789-f994-429e-8eb0-2c37a0108808-config-data\") on node \"crc\" DevicePath \"\""
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.640815 4687 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fd2f789-f994-429e-8eb0-2c37a0108808-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.640825 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fd2f789-f994-429e-8eb0-2c37a0108808-scripts\") on node \"crc\" DevicePath \"\""
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.660382 4687 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.690057 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9fd2f789-f994-429e-8eb0-2c37a0108808","Type":"ContainerDied","Data":"4aac96c1e6dd323e54dd8877ee53a408050fc2a154717e491f1f83d23d783ff4"}
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.690125 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.690151 4687 scope.go:117] "RemoveContainer" containerID="2181b6c1937d7e3ed1a60397e6a7da406766826b8be513e8af25179005bec7ff"
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.720520 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.735675 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.744107 4687 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.747856 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 28 09:20:08 crc kubenswrapper[4687]: E0228 09:20:08.748342 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fd2f789-f994-429e-8eb0-2c37a0108808" containerName="glance-log"
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.748363 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fd2f789-f994-429e-8eb0-2c37a0108808" containerName="glance-log"
Feb 28 09:20:08 crc kubenswrapper[4687]: E0228 09:20:08.748375 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fd2f789-f994-429e-8eb0-2c37a0108808" containerName="glance-httpd"
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.748896 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fd2f789-f994-429e-8eb0-2c37a0108808" containerName="glance-httpd"
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.749150 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fd2f789-f994-429e-8eb0-2c37a0108808" containerName="glance-httpd"
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.749167 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fd2f789-f994-429e-8eb0-2c37a0108808" containerName="glance-log"
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.750119 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.752258 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.752829 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.754160 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 28 09:20:08 crc kubenswrapper[4687]: E0228 09:20:08.770547 4687 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:57dfeeb1cb430ed73e6db471592cfb1a5f25d3d5c083f82d4a676f936978be81"
Feb 28 09:20:08 crc kubenswrapper[4687]: E0228 09:20:08.771305 4687 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:57dfeeb1cb430ed73e6db471592cfb1a5f25d3d5c083f82d4a676f936978be81,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n59ch68h678h5bch9hd8h677h5d4h56bh5f7h5d9h5fdh578h58h65fh59fh675h65ch96hbch556hbh654h685hf4hf9h64fh54dh68fh66fhcdh6fq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2wvxd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(2a0893a8-0386-4d6d-9476-c061c3fb5f3d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.793656 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-br7kf"
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.793659 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-29vzx"
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.812621 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.957163 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d086820b-63a2-481f-a349-1dad3879b659-httpd-run\") pod \"d086820b-63a2-481f-a349-1dad3879b659\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") "
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.957206 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-credential-keys\") pod \"565a264d-399a-47d9-8273-b8ca22fdc8b6\" (UID: \"565a264d-399a-47d9-8273-b8ca22fdc8b6\") "
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.957229 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-fernet-keys\") pod \"565a264d-399a-47d9-8273-b8ca22fdc8b6\" (UID: \"565a264d-399a-47d9-8273-b8ca22fdc8b6\") "
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.957250 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-combined-ca-bundle\") pod \"565a264d-399a-47d9-8273-b8ca22fdc8b6\" (UID: \"565a264d-399a-47d9-8273-b8ca22fdc8b6\") "
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.957277 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d086820b-63a2-481f-a349-1dad3879b659-internal-tls-certs\") pod \"d086820b-63a2-481f-a349-1dad3879b659\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") "
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.957304 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/268be2d7-dd2e-42f0-b112-230de1abb1d4-config\") pod \"268be2d7-dd2e-42f0-b112-230de1abb1d4\" (UID: \"268be2d7-dd2e-42f0-b112-230de1abb1d4\") "
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.957332 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-config-data\") pod \"565a264d-399a-47d9-8273-b8ca22fdc8b6\" (UID: \"565a264d-399a-47d9-8273-b8ca22fdc8b6\") "
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.957348 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zph6k\" (UniqueName: \"kubernetes.io/projected/565a264d-399a-47d9-8273-b8ca22fdc8b6-kube-api-access-zph6k\") pod \"565a264d-399a-47d9-8273-b8ca22fdc8b6\" (UID: \"565a264d-399a-47d9-8273-b8ca22fdc8b6\") "
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.957371 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d086820b-63a2-481f-a349-1dad3879b659-config-data\") pod \"d086820b-63a2-481f-a349-1dad3879b659\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") "
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.957389 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d086820b-63a2-481f-a349-1dad3879b659-scripts\") pod \"d086820b-63a2-481f-a349-1dad3879b659\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") "
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.957407 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d086820b-63a2-481f-a349-1dad3879b659-logs\") pod \"d086820b-63a2-481f-a349-1dad3879b659\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") "
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.957426 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268be2d7-dd2e-42f0-b112-230de1abb1d4-combined-ca-bundle\") pod \"268be2d7-dd2e-42f0-b112-230de1abb1d4\" (UID: \"268be2d7-dd2e-42f0-b112-230de1abb1d4\") "
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.957445 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-scripts\") pod \"565a264d-399a-47d9-8273-b8ca22fdc8b6\" (UID: \"565a264d-399a-47d9-8273-b8ca22fdc8b6\") "
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.957460 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"d086820b-63a2-481f-a349-1dad3879b659\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") "
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.957480 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72lv5\" (UniqueName: \"kubernetes.io/projected/d086820b-63a2-481f-a349-1dad3879b659-kube-api-access-72lv5\") pod \"d086820b-63a2-481f-a349-1dad3879b659\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") "
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.957503 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t9mp\" (UniqueName: \"kubernetes.io/projected/268be2d7-dd2e-42f0-b112-230de1abb1d4-kube-api-access-4t9mp\") pod \"268be2d7-dd2e-42f0-b112-230de1abb1d4\" (UID: \"268be2d7-dd2e-42f0-b112-230de1abb1d4\") "
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.957534 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d086820b-63a2-481f-a349-1dad3879b659-combined-ca-bundle\") pod \"d086820b-63a2-481f-a349-1dad3879b659\" (UID: \"d086820b-63a2-481f-a349-1dad3879b659\") "
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.957650 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfc7644-a187-4fe9-8067-fa474114c1a1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " pod="openstack/glance-default-external-api-0"
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.957682 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdfc7644-a187-4fe9-8067-fa474114c1a1-logs\") pod \"glance-default-external-api-0\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " pod="openstack/glance-default-external-api-0"
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.957701 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfc7644-a187-4fe9-8067-fa474114c1a1-config-data\") pod \"glance-default-external-api-0\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " pod="openstack/glance-default-external-api-0"
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.957720 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " pod="openstack/glance-default-external-api-0"
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.957741 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cdfc7644-a187-4fe9-8067-fa474114c1a1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " pod="openstack/glance-default-external-api-0"
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.957755 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9cwm\" (UniqueName: \"kubernetes.io/projected/cdfc7644-a187-4fe9-8067-fa474114c1a1-kube-api-access-b9cwm\") pod \"glance-default-external-api-0\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " pod="openstack/glance-default-external-api-0"
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.957785 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfc7644-a187-4fe9-8067-fa474114c1a1-scripts\") pod \"glance-default-external-api-0\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " pod="openstack/glance-default-external-api-0"
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.957840 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdfc7644-a187-4fe9-8067-fa474114c1a1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " pod="openstack/glance-default-external-api-0"
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.958311 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d086820b-63a2-481f-a349-1dad3879b659-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d086820b-63a2-481f-a349-1dad3879b659" (UID: "d086820b-63a2-481f-a349-1dad3879b659"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 09:20:08 crc kubenswrapper[4687]: I0228 09:20:08.968808 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d086820b-63a2-481f-a349-1dad3879b659-logs" (OuterVolumeSpecName: "logs") pod "d086820b-63a2-481f-a349-1dad3879b659" (UID: "d086820b-63a2-481f-a349-1dad3879b659"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.006450 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d086820b-63a2-481f-a349-1dad3879b659-scripts" (OuterVolumeSpecName: "scripts") pod "d086820b-63a2-481f-a349-1dad3879b659" (UID: "d086820b-63a2-481f-a349-1dad3879b659"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.018135 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "565a264d-399a-47d9-8273-b8ca22fdc8b6" (UID: "565a264d-399a-47d9-8273-b8ca22fdc8b6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.018196 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/565a264d-399a-47d9-8273-b8ca22fdc8b6-kube-api-access-zph6k" (OuterVolumeSpecName: "kube-api-access-zph6k") pod "565a264d-399a-47d9-8273-b8ca22fdc8b6" (UID: "565a264d-399a-47d9-8273-b8ca22fdc8b6"). InnerVolumeSpecName "kube-api-access-zph6k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.018304 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "565a264d-399a-47d9-8273-b8ca22fdc8b6" (UID: "565a264d-399a-47d9-8273-b8ca22fdc8b6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.018363 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-scripts" (OuterVolumeSpecName: "scripts") pod "565a264d-399a-47d9-8273-b8ca22fdc8b6" (UID: "565a264d-399a-47d9-8273-b8ca22fdc8b6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.018498 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-config-data" (OuterVolumeSpecName: "config-data") pod "565a264d-399a-47d9-8273-b8ca22fdc8b6" (UID: "565a264d-399a-47d9-8273-b8ca22fdc8b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.023556 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "d086820b-63a2-481f-a349-1dad3879b659" (UID: "d086820b-63a2-481f-a349-1dad3879b659"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.024562 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d086820b-63a2-481f-a349-1dad3879b659-kube-api-access-72lv5" (OuterVolumeSpecName: "kube-api-access-72lv5") pod "d086820b-63a2-481f-a349-1dad3879b659" (UID: "d086820b-63a2-481f-a349-1dad3879b659"). InnerVolumeSpecName "kube-api-access-72lv5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.025196 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/268be2d7-dd2e-42f0-b112-230de1abb1d4-kube-api-access-4t9mp" (OuterVolumeSpecName: "kube-api-access-4t9mp") pod "268be2d7-dd2e-42f0-b112-230de1abb1d4" (UID: "268be2d7-dd2e-42f0-b112-230de1abb1d4"). InnerVolumeSpecName "kube-api-access-4t9mp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.060833 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfc7644-a187-4fe9-8067-fa474114c1a1-scripts\") pod \"glance-default-external-api-0\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " pod="openstack/glance-default-external-api-0"
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.060918 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdfc7644-a187-4fe9-8067-fa474114c1a1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " pod="openstack/glance-default-external-api-0"
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.060967 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfc7644-a187-4fe9-8067-fa474114c1a1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " pod="openstack/glance-default-external-api-0"
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.060996 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdfc7644-a187-4fe9-8067-fa474114c1a1-logs\") pod \"glance-default-external-api-0\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " pod="openstack/glance-default-external-api-0"
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.061015 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfc7644-a187-4fe9-8067-fa474114c1a1-config-data\") pod \"glance-default-external-api-0\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " pod="openstack/glance-default-external-api-0"
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.061061 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " pod="openstack/glance-default-external-api-0"
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.061082 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cdfc7644-a187-4fe9-8067-fa474114c1a1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " pod="openstack/glance-default-external-api-0"
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.061098 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9cwm\" (UniqueName: \"kubernetes.io/projected/cdfc7644-a187-4fe9-8067-fa474114c1a1-kube-api-access-b9cwm\") pod \"glance-default-external-api-0\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " pod="openstack/glance-default-external-api-0"
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.061143 4687 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d086820b-63a2-481f-a349-1dad3879b659-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.061153 4687 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.061163 4687 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.061170 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-config-data\") on node \"crc\" DevicePath \"\""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.061178 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zph6k\" (UniqueName: \"kubernetes.io/projected/565a264d-399a-47d9-8273-b8ca22fdc8b6-kube-api-access-zph6k\") on node \"crc\" DevicePath \"\""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.061186 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d086820b-63a2-481f-a349-1dad3879b659-scripts\") on node \"crc\" DevicePath \"\""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.061195 4687 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d086820b-63a2-481f-a349-1dad3879b659-logs\") on node \"crc\" DevicePath \"\""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.061203 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-scripts\") on node \"crc\" DevicePath \"\""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.061220 4687 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.061231 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72lv5\" (UniqueName: \"kubernetes.io/projected/d086820b-63a2-481f-a349-1dad3879b659-kube-api-access-72lv5\") on node \"crc\" DevicePath \"\""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.061239 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t9mp\" (UniqueName: \"kubernetes.io/projected/268be2d7-dd2e-42f0-b112-230de1abb1d4-kube-api-access-4t9mp\") on node \"crc\" DevicePath \"\""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.061531 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdfc7644-a187-4fe9-8067-fa474114c1a1-logs\") pod \"glance-default-external-api-0\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " pod="openstack/glance-default-external-api-0"
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.070437 4687 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0"
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.070727 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cdfc7644-a187-4fe9-8067-fa474114c1a1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " pod="openstack/glance-default-external-api-0"
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.084477 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfc7644-a187-4fe9-8067-fa474114c1a1-config-data\") pod \"glance-default-external-api-0\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " pod="openstack/glance-default-external-api-0"
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.085607 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdfc7644-a187-4fe9-8067-fa474114c1a1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " pod="openstack/glance-default-external-api-0"
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.096150 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d086820b-63a2-481f-a349-1dad3879b659-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d086820b-63a2-481f-a349-1dad3879b659" (UID: "d086820b-63a2-481f-a349-1dad3879b659"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.125638 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfc7644-a187-4fe9-8067-fa474114c1a1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " pod="openstack/glance-default-external-api-0"
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.129489 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfc7644-a187-4fe9-8067-fa474114c1a1-scripts\") pod \"glance-default-external-api-0\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " pod="openstack/glance-default-external-api-0"
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.140610 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9cwm\" (UniqueName: \"kubernetes.io/projected/cdfc7644-a187-4fe9-8067-fa474114c1a1-kube-api-access-b9cwm\") pod \"glance-default-external-api-0\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " pod="openstack/glance-default-external-api-0"
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.153479 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268be2d7-dd2e-42f0-b112-230de1abb1d4-config" (OuterVolumeSpecName: "config") pod "268be2d7-dd2e-42f0-b112-230de1abb1d4" (UID: "268be2d7-dd2e-42f0-b112-230de1abb1d4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.155710 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268be2d7-dd2e-42f0-b112-230de1abb1d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "268be2d7-dd2e-42f0-b112-230de1abb1d4" (UID: "268be2d7-dd2e-42f0-b112-230de1abb1d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.156133 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d086820b-63a2-481f-a349-1dad3879b659-config-data" (OuterVolumeSpecName: "config-data") pod "d086820b-63a2-481f-a349-1dad3879b659" (UID: "d086820b-63a2-481f-a349-1dad3879b659"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.158185 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "565a264d-399a-47d9-8273-b8ca22fdc8b6" (UID: "565a264d-399a-47d9-8273-b8ca22fdc8b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.161071 4687 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.162888 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d086820b-63a2-481f-a349-1dad3879b659-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.162925 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/565a264d-399a-47d9-8273-b8ca22fdc8b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.162936 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/268be2d7-dd2e-42f0-b112-230de1abb1d4-config\") on node \"crc\" DevicePath \"\""
Feb
28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.162946 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d086820b-63a2-481f-a349-1dad3879b659-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.162954 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268be2d7-dd2e-42f0-b112-230de1abb1d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.162963 4687 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.163586 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " pod="openstack/glance-default-external-api-0" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.168840 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d086820b-63a2-481f-a349-1dad3879b659-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d086820b-63a2-481f-a349-1dad3879b659" (UID: "d086820b-63a2-481f-a349-1dad3879b659"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.264576 4687 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d086820b-63a2-481f-a349-1dad3879b659-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.373618 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.699292 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-br7kf" event={"ID":"268be2d7-dd2e-42f0-b112-230de1abb1d4","Type":"ContainerDied","Data":"6d4e7fa7da39fb9bd49b856d6ef041a63130a67af56f1a515d52b3a99f475ebf"} Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.699321 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-br7kf" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.699335 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d4e7fa7da39fb9bd49b856d6ef041a63130a67af56f1a515d52b3a99f475ebf" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.701978 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d086820b-63a2-481f-a349-1dad3879b659","Type":"ContainerDied","Data":"a9c3d266eda92ea5486216c3adf2811950472a32ce807a12bee1be0edb0657cb"} Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.702050 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.704102 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-29vzx" event={"ID":"565a264d-399a-47d9-8273-b8ca22fdc8b6","Type":"ContainerDied","Data":"114dd4a604c871beb8ae5ad7dfcf96747bc6b8498677fe685cddfe38832c7f78"} Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.704143 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="114dd4a604c871beb8ae5ad7dfcf96747bc6b8498677fe685cddfe38832c7f78" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.704191 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-29vzx" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.743258 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.763054 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.771879 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 28 09:20:09 crc kubenswrapper[4687]: E0228 09:20:09.772305 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d086820b-63a2-481f-a349-1dad3879b659" containerName="glance-httpd" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.772328 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="d086820b-63a2-481f-a349-1dad3879b659" containerName="glance-httpd" Feb 28 09:20:09 crc kubenswrapper[4687]: E0228 09:20:09.772356 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="268be2d7-dd2e-42f0-b112-230de1abb1d4" containerName="neutron-db-sync" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.772364 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="268be2d7-dd2e-42f0-b112-230de1abb1d4" containerName="neutron-db-sync" Feb 28 09:20:09 crc kubenswrapper[4687]: E0228 09:20:09.772387 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="565a264d-399a-47d9-8273-b8ca22fdc8b6" containerName="keystone-bootstrap" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.772394 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="565a264d-399a-47d9-8273-b8ca22fdc8b6" containerName="keystone-bootstrap" Feb 28 09:20:09 crc kubenswrapper[4687]: E0228 09:20:09.772428 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d086820b-63a2-481f-a349-1dad3879b659" containerName="glance-log" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 
09:20:09.772436 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="d086820b-63a2-481f-a349-1dad3879b659" containerName="glance-log" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.772633 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="565a264d-399a-47d9-8273-b8ca22fdc8b6" containerName="keystone-bootstrap" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.772648 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="d086820b-63a2-481f-a349-1dad3879b659" containerName="glance-log" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.772668 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="268be2d7-dd2e-42f0-b112-230de1abb1d4" containerName="neutron-db-sync" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.772681 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="d086820b-63a2-481f-a349-1dad3879b659" containerName="glance-httpd" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.773667 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.776318 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.776549 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.776691 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.875986 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqlpt\" (UniqueName: \"kubernetes.io/projected/59ec19ad-b746-417b-a573-1b450746e794-kube-api-access-tqlpt\") pod \"glance-default-internal-api-0\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.876253 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59ec19ad-b746-417b-a573-1b450746e794-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.876311 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59ec19ad-b746-417b-a573-1b450746e794-logs\") pod \"glance-default-internal-api-0\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.876357 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/59ec19ad-b746-417b-a573-1b450746e794-config-data\") pod \"glance-default-internal-api-0\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.876378 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.876668 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59ec19ad-b746-417b-a573-1b450746e794-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.876806 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59ec19ad-b746-417b-a573-1b450746e794-scripts\") pod \"glance-default-internal-api-0\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.876882 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59ec19ad-b746-417b-a573-1b450746e794-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.917898 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-29vzx"] Feb 28 
09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.941876 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-29vzx"] Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.978773 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59ec19ad-b746-417b-a573-1b450746e794-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.978839 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59ec19ad-b746-417b-a573-1b450746e794-scripts\") pod \"glance-default-internal-api-0\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.978872 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59ec19ad-b746-417b-a573-1b450746e794-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.978913 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqlpt\" (UniqueName: \"kubernetes.io/projected/59ec19ad-b746-417b-a573-1b450746e794-kube-api-access-tqlpt\") pod \"glance-default-internal-api-0\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.978969 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59ec19ad-b746-417b-a573-1b450746e794-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.978992 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59ec19ad-b746-417b-a573-1b450746e794-logs\") pod \"glance-default-internal-api-0\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.979013 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59ec19ad-b746-417b-a573-1b450746e794-config-data\") pod \"glance-default-internal-api-0\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.979058 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.979057 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-v6s24"] Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.980890 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-v6s24" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.982822 4687 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.983318 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59ec19ad-b746-417b-a573-1b450746e794-logs\") pod \"glance-default-internal-api-0\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.988110 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.988339 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.988533 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-29qml" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.989166 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.989334 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.991121 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59ec19ad-b746-417b-a573-1b450746e794-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " 
pod="openstack/glance-default-internal-api-0" Feb 28 09:20:09 crc kubenswrapper[4687]: I0228 09:20:09.996592 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59ec19ad-b746-417b-a573-1b450746e794-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.001374 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59ec19ad-b746-417b-a573-1b450746e794-config-data\") pod \"glance-default-internal-api-0\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.003266 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59ec19ad-b746-417b-a573-1b450746e794-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.009716 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59ec19ad-b746-417b-a573-1b450746e794-scripts\") pod \"glance-default-internal-api-0\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.017154 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqlpt\" (UniqueName: \"kubernetes.io/projected/59ec19ad-b746-417b-a573-1b450746e794-kube-api-access-tqlpt\") pod \"glance-default-internal-api-0\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:20:10 crc kubenswrapper[4687]: 
I0228 09:20:10.040951 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-v6s24"] Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.086228 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-crzs5"] Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.089529 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-fernet-keys\") pod \"keystone-bootstrap-v6s24\" (UID: \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\") " pod="openstack/keystone-bootstrap-v6s24" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.090675 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-scripts\") pod \"keystone-bootstrap-v6s24\" (UID: \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\") " pod="openstack/keystone-bootstrap-v6s24" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.091474 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6dt5\" (UniqueName: \"kubernetes.io/projected/2c8490bf-32fb-4d04-974d-b2ca311f4b55-kube-api-access-l6dt5\") pod \"keystone-bootstrap-v6s24\" (UID: \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\") " pod="openstack/keystone-bootstrap-v6s24" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.091539 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-config-data\") pod \"keystone-bootstrap-v6s24\" (UID: \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\") " pod="openstack/keystone-bootstrap-v6s24" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.091639 4687 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-combined-ca-bundle\") pod \"keystone-bootstrap-v6s24\" (UID: \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\") " pod="openstack/keystone-bootstrap-v6s24" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.092261 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-credential-keys\") pod \"keystone-bootstrap-v6s24\" (UID: \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\") " pod="openstack/keystone-bootstrap-v6s24" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.101758 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.101845 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-crzs5" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.172749 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-crzs5"] Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.190194 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7b9c5b669b-xd8lz"] Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.192310 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7b9c5b669b-xd8lz" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.194978 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-v7cv7" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.195235 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.195455 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.195923 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.197967 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-combined-ca-bundle\") pod \"keystone-bootstrap-v6s24\" (UID: \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\") " pod="openstack/keystone-bootstrap-v6s24" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.198090 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-credential-keys\") pod \"keystone-bootstrap-v6s24\" (UID: \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\") " pod="openstack/keystone-bootstrap-v6s24" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.198161 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-fernet-keys\") pod \"keystone-bootstrap-v6s24\" (UID: \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\") " pod="openstack/keystone-bootstrap-v6s24" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.198707 4687 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-scripts\") pod \"keystone-bootstrap-v6s24\" (UID: \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\") " pod="openstack/keystone-bootstrap-v6s24" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.198742 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6dt5\" (UniqueName: \"kubernetes.io/projected/2c8490bf-32fb-4d04-974d-b2ca311f4b55-kube-api-access-l6dt5\") pod \"keystone-bootstrap-v6s24\" (UID: \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\") " pod="openstack/keystone-bootstrap-v6s24" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.198763 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-config-data\") pod \"keystone-bootstrap-v6s24\" (UID: \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\") " pod="openstack/keystone-bootstrap-v6s24" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.205923 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-scripts\") pod \"keystone-bootstrap-v6s24\" (UID: \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\") " pod="openstack/keystone-bootstrap-v6s24" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.205356 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-fernet-keys\") pod \"keystone-bootstrap-v6s24\" (UID: \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\") " pod="openstack/keystone-bootstrap-v6s24" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.213365 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-combined-ca-bundle\") pod 
\"keystone-bootstrap-v6s24\" (UID: \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\") " pod="openstack/keystone-bootstrap-v6s24" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.214140 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-credential-keys\") pod \"keystone-bootstrap-v6s24\" (UID: \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\") " pod="openstack/keystone-bootstrap-v6s24" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.214512 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7b9c5b669b-xd8lz"] Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.215075 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-config-data\") pod \"keystone-bootstrap-v6s24\" (UID: \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\") " pod="openstack/keystone-bootstrap-v6s24" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.217531 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6dt5\" (UniqueName: \"kubernetes.io/projected/2c8490bf-32fb-4d04-974d-b2ca311f4b55-kube-api-access-l6dt5\") pod \"keystone-bootstrap-v6s24\" (UID: \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\") " pod="openstack/keystone-bootstrap-v6s24" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.301207 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-dns-swift-storage-0\") pod \"dnsmasq-dns-7859c7799c-crzs5\" (UID: \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\") " pod="openstack/dnsmasq-dns-7859c7799c-crzs5" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.301354 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-dns-svc\") pod \"dnsmasq-dns-7859c7799c-crzs5\" (UID: \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\") " pod="openstack/dnsmasq-dns-7859c7799c-crzs5" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.301392 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-ovsdbserver-nb\") pod \"dnsmasq-dns-7859c7799c-crzs5\" (UID: \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\") " pod="openstack/dnsmasq-dns-7859c7799c-crzs5" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.301449 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvfqj\" (UniqueName: \"kubernetes.io/projected/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-kube-api-access-zvfqj\") pod \"dnsmasq-dns-7859c7799c-crzs5\" (UID: \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\") " pod="openstack/dnsmasq-dns-7859c7799c-crzs5" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.301477 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0859ec96-842c-472a-be1b-f29c8f1df2d9-httpd-config\") pod \"neutron-7b9c5b669b-xd8lz\" (UID: \"0859ec96-842c-472a-be1b-f29c8f1df2d9\") " pod="openstack/neutron-7b9c5b669b-xd8lz" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.301503 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-config\") pod \"dnsmasq-dns-7859c7799c-crzs5\" (UID: \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\") " pod="openstack/dnsmasq-dns-7859c7799c-crzs5" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.301570 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/0859ec96-842c-472a-be1b-f29c8f1df2d9-config\") pod \"neutron-7b9c5b669b-xd8lz\" (UID: \"0859ec96-842c-472a-be1b-f29c8f1df2d9\") " pod="openstack/neutron-7b9c5b669b-xd8lz" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.301732 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0859ec96-842c-472a-be1b-f29c8f1df2d9-combined-ca-bundle\") pod \"neutron-7b9c5b669b-xd8lz\" (UID: \"0859ec96-842c-472a-be1b-f29c8f1df2d9\") " pod="openstack/neutron-7b9c5b669b-xd8lz" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.301764 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv5ph\" (UniqueName: \"kubernetes.io/projected/0859ec96-842c-472a-be1b-f29c8f1df2d9-kube-api-access-qv5ph\") pod \"neutron-7b9c5b669b-xd8lz\" (UID: \"0859ec96-842c-472a-be1b-f29c8f1df2d9\") " pod="openstack/neutron-7b9c5b669b-xd8lz" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.301793 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0859ec96-842c-472a-be1b-f29c8f1df2d9-ovndb-tls-certs\") pod \"neutron-7b9c5b669b-xd8lz\" (UID: \"0859ec96-842c-472a-be1b-f29c8f1df2d9\") " pod="openstack/neutron-7b9c5b669b-xd8lz" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.301910 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-ovsdbserver-sb\") pod \"dnsmasq-dns-7859c7799c-crzs5\" (UID: \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\") " pod="openstack/dnsmasq-dns-7859c7799c-crzs5" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.374108 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-v6s24" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.390935 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.408090 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-ovsdbserver-sb\") pod \"dnsmasq-dns-7859c7799c-crzs5\" (UID: \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\") " pod="openstack/dnsmasq-dns-7859c7799c-crzs5" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.408224 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-dns-swift-storage-0\") pod \"dnsmasq-dns-7859c7799c-crzs5\" (UID: \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\") " pod="openstack/dnsmasq-dns-7859c7799c-crzs5" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.408272 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-dns-svc\") pod \"dnsmasq-dns-7859c7799c-crzs5\" (UID: \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\") " pod="openstack/dnsmasq-dns-7859c7799c-crzs5" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.408293 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-ovsdbserver-nb\") pod \"dnsmasq-dns-7859c7799c-crzs5\" (UID: \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\") " pod="openstack/dnsmasq-dns-7859c7799c-crzs5" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.408322 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvfqj\" (UniqueName: 
\"kubernetes.io/projected/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-kube-api-access-zvfqj\") pod \"dnsmasq-dns-7859c7799c-crzs5\" (UID: \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\") " pod="openstack/dnsmasq-dns-7859c7799c-crzs5" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.408354 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0859ec96-842c-472a-be1b-f29c8f1df2d9-httpd-config\") pod \"neutron-7b9c5b669b-xd8lz\" (UID: \"0859ec96-842c-472a-be1b-f29c8f1df2d9\") " pod="openstack/neutron-7b9c5b669b-xd8lz" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.408374 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-config\") pod \"dnsmasq-dns-7859c7799c-crzs5\" (UID: \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\") " pod="openstack/dnsmasq-dns-7859c7799c-crzs5" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.408402 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0859ec96-842c-472a-be1b-f29c8f1df2d9-config\") pod \"neutron-7b9c5b669b-xd8lz\" (UID: \"0859ec96-842c-472a-be1b-f29c8f1df2d9\") " pod="openstack/neutron-7b9c5b669b-xd8lz" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.408467 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0859ec96-842c-472a-be1b-f29c8f1df2d9-combined-ca-bundle\") pod \"neutron-7b9c5b669b-xd8lz\" (UID: \"0859ec96-842c-472a-be1b-f29c8f1df2d9\") " pod="openstack/neutron-7b9c5b669b-xd8lz" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.408498 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv5ph\" (UniqueName: \"kubernetes.io/projected/0859ec96-842c-472a-be1b-f29c8f1df2d9-kube-api-access-qv5ph\") pod 
\"neutron-7b9c5b669b-xd8lz\" (UID: \"0859ec96-842c-472a-be1b-f29c8f1df2d9\") " pod="openstack/neutron-7b9c5b669b-xd8lz" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.408522 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0859ec96-842c-472a-be1b-f29c8f1df2d9-ovndb-tls-certs\") pod \"neutron-7b9c5b669b-xd8lz\" (UID: \"0859ec96-842c-472a-be1b-f29c8f1df2d9\") " pod="openstack/neutron-7b9c5b669b-xd8lz" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.410482 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-ovsdbserver-sb\") pod \"dnsmasq-dns-7859c7799c-crzs5\" (UID: \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\") " pod="openstack/dnsmasq-dns-7859c7799c-crzs5" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.411159 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-dns-swift-storage-0\") pod \"dnsmasq-dns-7859c7799c-crzs5\" (UID: \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\") " pod="openstack/dnsmasq-dns-7859c7799c-crzs5" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.411908 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-dns-svc\") pod \"dnsmasq-dns-7859c7799c-crzs5\" (UID: \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\") " pod="openstack/dnsmasq-dns-7859c7799c-crzs5" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.411987 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-config\") pod \"dnsmasq-dns-7859c7799c-crzs5\" (UID: \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\") " 
pod="openstack/dnsmasq-dns-7859c7799c-crzs5" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.412599 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-ovsdbserver-nb\") pod \"dnsmasq-dns-7859c7799c-crzs5\" (UID: \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\") " pod="openstack/dnsmasq-dns-7859c7799c-crzs5" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.415532 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0859ec96-842c-472a-be1b-f29c8f1df2d9-ovndb-tls-certs\") pod \"neutron-7b9c5b669b-xd8lz\" (UID: \"0859ec96-842c-472a-be1b-f29c8f1df2d9\") " pod="openstack/neutron-7b9c5b669b-xd8lz" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.423236 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0859ec96-842c-472a-be1b-f29c8f1df2d9-combined-ca-bundle\") pod \"neutron-7b9c5b669b-xd8lz\" (UID: \"0859ec96-842c-472a-be1b-f29c8f1df2d9\") " pod="openstack/neutron-7b9c5b669b-xd8lz" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.425380 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0859ec96-842c-472a-be1b-f29c8f1df2d9-httpd-config\") pod \"neutron-7b9c5b669b-xd8lz\" (UID: \"0859ec96-842c-472a-be1b-f29c8f1df2d9\") " pod="openstack/neutron-7b9c5b669b-xd8lz" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.427414 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0859ec96-842c-472a-be1b-f29c8f1df2d9-config\") pod \"neutron-7b9c5b669b-xd8lz\" (UID: \"0859ec96-842c-472a-be1b-f29c8f1df2d9\") " pod="openstack/neutron-7b9c5b669b-xd8lz" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.428686 4687 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zvfqj\" (UniqueName: \"kubernetes.io/projected/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-kube-api-access-zvfqj\") pod \"dnsmasq-dns-7859c7799c-crzs5\" (UID: \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\") " pod="openstack/dnsmasq-dns-7859c7799c-crzs5" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.430257 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv5ph\" (UniqueName: \"kubernetes.io/projected/0859ec96-842c-472a-be1b-f29c8f1df2d9-kube-api-access-qv5ph\") pod \"neutron-7b9c5b669b-xd8lz\" (UID: \"0859ec96-842c-472a-be1b-f29c8f1df2d9\") " pod="openstack/neutron-7b9c5b669b-xd8lz" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.471756 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-crzs5" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.525239 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7b9c5b669b-xd8lz" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.665481 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="565a264d-399a-47d9-8273-b8ca22fdc8b6" path="/var/lib/kubelet/pods/565a264d-399a-47d9-8273-b8ca22fdc8b6/volumes" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.666134 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fd2f789-f994-429e-8eb0-2c37a0108808" path="/var/lib/kubelet/pods/9fd2f789-f994-429e-8eb0-2c37a0108808/volumes" Feb 28 09:20:10 crc kubenswrapper[4687]: I0228 09:20:10.666935 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d086820b-63a2-481f-a349-1dad3879b659" path="/var/lib/kubelet/pods/d086820b-63a2-481f-a349-1dad3879b659/volumes" Feb 28 09:20:12 crc kubenswrapper[4687]: I0228 09:20:12.630138 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5bd77ccf75-bqx56"] Feb 28 09:20:12 crc kubenswrapper[4687]: 
I0228 09:20:12.632559 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:12 crc kubenswrapper[4687]: I0228 09:20:12.637288 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 28 09:20:12 crc kubenswrapper[4687]: I0228 09:20:12.638581 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 28 09:20:12 crc kubenswrapper[4687]: I0228 09:20:12.639740 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5bd77ccf75-bqx56"] Feb 28 09:20:12 crc kubenswrapper[4687]: I0228 09:20:12.761357 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-ovndb-tls-certs\") pod \"neutron-5bd77ccf75-bqx56\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:12 crc kubenswrapper[4687]: I0228 09:20:12.762346 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-public-tls-certs\") pod \"neutron-5bd77ccf75-bqx56\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:12 crc kubenswrapper[4687]: I0228 09:20:12.762507 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-internal-tls-certs\") pod \"neutron-5bd77ccf75-bqx56\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:12 crc kubenswrapper[4687]: I0228 09:20:12.762996 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-config\") pod \"neutron-5bd77ccf75-bqx56\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:12 crc kubenswrapper[4687]: I0228 09:20:12.763470 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-httpd-config\") pod \"neutron-5bd77ccf75-bqx56\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:12 crc kubenswrapper[4687]: I0228 09:20:12.763564 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5969\" (UniqueName: \"kubernetes.io/projected/d655bdf4-33ab-45fa-b1e4-c37aede5609a-kube-api-access-r5969\") pod \"neutron-5bd77ccf75-bqx56\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:12 crc kubenswrapper[4687]: I0228 09:20:12.763901 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-combined-ca-bundle\") pod \"neutron-5bd77ccf75-bqx56\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:12 crc kubenswrapper[4687]: I0228 09:20:12.867245 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-config\") pod \"neutron-5bd77ccf75-bqx56\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:12 crc kubenswrapper[4687]: I0228 09:20:12.867343 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-httpd-config\") pod \"neutron-5bd77ccf75-bqx56\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:12 crc kubenswrapper[4687]: I0228 09:20:12.867391 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5969\" (UniqueName: \"kubernetes.io/projected/d655bdf4-33ab-45fa-b1e4-c37aede5609a-kube-api-access-r5969\") pod \"neutron-5bd77ccf75-bqx56\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:12 crc kubenswrapper[4687]: I0228 09:20:12.867452 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-combined-ca-bundle\") pod \"neutron-5bd77ccf75-bqx56\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:12 crc kubenswrapper[4687]: I0228 09:20:12.867536 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-ovndb-tls-certs\") pod \"neutron-5bd77ccf75-bqx56\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:12 crc kubenswrapper[4687]: I0228 09:20:12.867737 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-public-tls-certs\") pod \"neutron-5bd77ccf75-bqx56\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:12 crc kubenswrapper[4687]: I0228 09:20:12.867821 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-internal-tls-certs\") pod \"neutron-5bd77ccf75-bqx56\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:12 crc kubenswrapper[4687]: I0228 09:20:12.877583 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-internal-tls-certs\") pod \"neutron-5bd77ccf75-bqx56\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:12 crc kubenswrapper[4687]: I0228 09:20:12.877687 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-httpd-config\") pod \"neutron-5bd77ccf75-bqx56\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:12 crc kubenswrapper[4687]: I0228 09:20:12.878516 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-ovndb-tls-certs\") pod \"neutron-5bd77ccf75-bqx56\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:12 crc kubenswrapper[4687]: I0228 09:20:12.879922 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-public-tls-certs\") pod \"neutron-5bd77ccf75-bqx56\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:12 crc kubenswrapper[4687]: I0228 09:20:12.880906 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-combined-ca-bundle\") pod \"neutron-5bd77ccf75-bqx56\" (UID: 
\"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:12 crc kubenswrapper[4687]: I0228 09:20:12.882073 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-config\") pod \"neutron-5bd77ccf75-bqx56\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:12 crc kubenswrapper[4687]: I0228 09:20:12.884125 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5969\" (UniqueName: \"kubernetes.io/projected/d655bdf4-33ab-45fa-b1e4-c37aede5609a-kube-api-access-r5969\") pod \"neutron-5bd77ccf75-bqx56\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:12 crc kubenswrapper[4687]: I0228 09:20:12.950650 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:17 crc kubenswrapper[4687]: E0228 09:20:17.094214 4687 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d" Feb 28 09:20:17 crc kubenswrapper[4687]: E0228 09:20:17.094739 4687 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fk7lz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-mvkm8_openstack(21a39679-80b0-4a80-ad64-fe3707c2a9f0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 28 09:20:17 crc kubenswrapper[4687]: E0228 09:20:17.096477 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-mvkm8" 
podUID="21a39679-80b0-4a80-ad64-fe3707c2a9f0" Feb 28 09:20:17 crc kubenswrapper[4687]: I0228 09:20:17.214281 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" Feb 28 09:20:17 crc kubenswrapper[4687]: I0228 09:20:17.379622 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmwrs\" (UniqueName: \"kubernetes.io/projected/878defc9-19d4-48ce-92c3-9b0976de28d2-kube-api-access-gmwrs\") pod \"878defc9-19d4-48ce-92c3-9b0976de28d2\" (UID: \"878defc9-19d4-48ce-92c3-9b0976de28d2\") " Feb 28 09:20:17 crc kubenswrapper[4687]: I0228 09:20:17.379698 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-ovsdbserver-sb\") pod \"878defc9-19d4-48ce-92c3-9b0976de28d2\" (UID: \"878defc9-19d4-48ce-92c3-9b0976de28d2\") " Feb 28 09:20:17 crc kubenswrapper[4687]: I0228 09:20:17.379745 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-dns-svc\") pod \"878defc9-19d4-48ce-92c3-9b0976de28d2\" (UID: \"878defc9-19d4-48ce-92c3-9b0976de28d2\") " Feb 28 09:20:17 crc kubenswrapper[4687]: I0228 09:20:17.379830 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-config\") pod \"878defc9-19d4-48ce-92c3-9b0976de28d2\" (UID: \"878defc9-19d4-48ce-92c3-9b0976de28d2\") " Feb 28 09:20:17 crc kubenswrapper[4687]: I0228 09:20:17.380078 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-dns-swift-storage-0\") pod \"878defc9-19d4-48ce-92c3-9b0976de28d2\" (UID: \"878defc9-19d4-48ce-92c3-9b0976de28d2\") " 
Feb 28 09:20:17 crc kubenswrapper[4687]: I0228 09:20:17.380173 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-ovsdbserver-nb\") pod \"878defc9-19d4-48ce-92c3-9b0976de28d2\" (UID: \"878defc9-19d4-48ce-92c3-9b0976de28d2\") " Feb 28 09:20:17 crc kubenswrapper[4687]: I0228 09:20:17.385327 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/878defc9-19d4-48ce-92c3-9b0976de28d2-kube-api-access-gmwrs" (OuterVolumeSpecName: "kube-api-access-gmwrs") pod "878defc9-19d4-48ce-92c3-9b0976de28d2" (UID: "878defc9-19d4-48ce-92c3-9b0976de28d2"). InnerVolumeSpecName "kube-api-access-gmwrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:20:17 crc kubenswrapper[4687]: I0228 09:20:17.415358 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-config" (OuterVolumeSpecName: "config") pod "878defc9-19d4-48ce-92c3-9b0976de28d2" (UID: "878defc9-19d4-48ce-92c3-9b0976de28d2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:20:17 crc kubenswrapper[4687]: I0228 09:20:17.416492 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "878defc9-19d4-48ce-92c3-9b0976de28d2" (UID: "878defc9-19d4-48ce-92c3-9b0976de28d2"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:20:17 crc kubenswrapper[4687]: I0228 09:20:17.421120 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "878defc9-19d4-48ce-92c3-9b0976de28d2" (UID: "878defc9-19d4-48ce-92c3-9b0976de28d2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:20:17 crc kubenswrapper[4687]: I0228 09:20:17.425570 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "878defc9-19d4-48ce-92c3-9b0976de28d2" (UID: "878defc9-19d4-48ce-92c3-9b0976de28d2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:20:17 crc kubenswrapper[4687]: I0228 09:20:17.430488 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "878defc9-19d4-48ce-92c3-9b0976de28d2" (UID: "878defc9-19d4-48ce-92c3-9b0976de28d2"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:20:17 crc kubenswrapper[4687]: I0228 09:20:17.486600 4687 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:17 crc kubenswrapper[4687]: I0228 09:20:17.486643 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmwrs\" (UniqueName: \"kubernetes.io/projected/878defc9-19d4-48ce-92c3-9b0976de28d2-kube-api-access-gmwrs\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:17 crc kubenswrapper[4687]: I0228 09:20:17.486662 4687 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:17 crc kubenswrapper[4687]: I0228 09:20:17.486678 4687 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:17 crc kubenswrapper[4687]: I0228 09:20:17.486688 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:17 crc kubenswrapper[4687]: I0228 09:20:17.486700 4687 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/878defc9-19d4-48ce-92c3-9b0976de28d2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:17 crc kubenswrapper[4687]: I0228 09:20:17.497744 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537840-2wf4q"] Feb 28 09:20:17 crc kubenswrapper[4687]: I0228 09:20:17.699880 4687 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" 
podUID="878defc9-19d4-48ce-92c3-9b0976de28d2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: i/o timeout" Feb 28 09:20:17 crc kubenswrapper[4687]: I0228 09:20:17.781479 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" Feb 28 09:20:17 crc kubenswrapper[4687]: I0228 09:20:17.781481 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-2zs2p" event={"ID":"878defc9-19d4-48ce-92c3-9b0976de28d2","Type":"ContainerDied","Data":"01f6398dc076b759abccc0a91b0e944464727b20c9b9564fe5e5a3f0bb784533"} Feb 28 09:20:17 crc kubenswrapper[4687]: E0228 09:20:17.784925 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d\\\"\"" pod="openstack/barbican-db-sync-mvkm8" podUID="21a39679-80b0-4a80-ad64-fe3707c2a9f0" Feb 28 09:20:17 crc kubenswrapper[4687]: I0228 09:20:17.843541 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-2zs2p"] Feb 28 09:20:17 crc kubenswrapper[4687]: I0228 09:20:17.849473 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-2zs2p"] Feb 28 09:20:18 crc kubenswrapper[4687]: E0228 09:20:18.171382 4687 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838" Feb 28 09:20:18 crc kubenswrapper[4687]: E0228 09:20:18.171726 4687 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-42fhc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Ca
pabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-c9j72_openstack(3e5e221e-73c7-44a2-9af9-0feb60b412e0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 28 09:20:18 crc kubenswrapper[4687]: E0228 09:20:18.172993 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-c9j72" podUID="3e5e221e-73c7-44a2-9af9-0feb60b412e0" Feb 28 09:20:18 crc kubenswrapper[4687]: I0228 09:20:18.202828 4687 scope.go:117] "RemoveContainer" containerID="00100e0fbea373e59576c62f6e38c68d85a8282e212320e6cae828594dd164cc" Feb 28 09:20:18 crc kubenswrapper[4687]: W0228 09:20:18.215475 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod694d7626_7d52_4f55_a8c3_79feaec0e5e2.slice/crio-df24fc2f26be7adfe59cf6c0dc1ed98329fe165675aded440a7c3ce41b821f45 WatchSource:0}: Error finding container df24fc2f26be7adfe59cf6c0dc1ed98329fe165675aded440a7c3ce41b821f45: Status 404 returned error can't find the container with id df24fc2f26be7adfe59cf6c0dc1ed98329fe165675aded440a7c3ce41b821f45 Feb 28 09:20:18 crc kubenswrapper[4687]: I0228 09:20:18.603205 4687 scope.go:117] "RemoveContainer" containerID="778b96cc88ff91771173ff240f0837a50b20e9098afa8f37822ee48240c110c3" Feb 28 09:20:18 crc kubenswrapper[4687]: I0228 09:20:18.612923 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/horizon-b9587f844-jq5pd"] Feb 28 09:20:18 crc kubenswrapper[4687]: I0228 09:20:18.680217 4687 scope.go:117] "RemoveContainer" containerID="9b75cdf75a917e7e4d9797cc55dbe0772ac792d304b57a4b110c886a675867d0" Feb 28 09:20:18 crc kubenswrapper[4687]: I0228 09:20:18.703172 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="878defc9-19d4-48ce-92c3-9b0976de28d2" path="/var/lib/kubelet/pods/878defc9-19d4-48ce-92c3-9b0976de28d2/volumes" Feb 28 09:20:18 crc kubenswrapper[4687]: W0228 09:20:18.725574 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod113841cd_f813_4ee0_93cf_2e3cfb43f6fc.slice/crio-be6040ea342b3e9583cc5326be49a0f09e1b299d6eeb2f67aec9d294d77790aa WatchSource:0}: Error finding container be6040ea342b3e9583cc5326be49a0f09e1b299d6eeb2f67aec9d294d77790aa: Status 404 returned error can't find the container with id be6040ea342b3e9583cc5326be49a0f09e1b299d6eeb2f67aec9d294d77790aa Feb 28 09:20:18 crc kubenswrapper[4687]: I0228 09:20:18.748370 4687 scope.go:117] "RemoveContainer" containerID="c71ddd519cc0345f9d2e74444dbba50f32616af11432deeaaec79043832ee2de" Feb 28 09:20:18 crc kubenswrapper[4687]: I0228 09:20:18.797770 4687 scope.go:117] "RemoveContainer" containerID="ba776bd8962fb3d4ada9f524b6f2f914f53f7ad479b2afc4460a951623bcb5cb" Feb 28 09:20:18 crc kubenswrapper[4687]: I0228 09:20:18.807768 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537840-2wf4q" event={"ID":"694d7626-7d52-4f55-a8c3-79feaec0e5e2","Type":"ContainerStarted","Data":"df24fc2f26be7adfe59cf6c0dc1ed98329fe165675aded440a7c3ce41b821f45"} Feb 28 09:20:18 crc kubenswrapper[4687]: I0228 09:20:18.812724 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b9587f844-jq5pd" 
event={"ID":"113841cd-f813-4ee0-93cf-2e3cfb43f6fc","Type":"ContainerStarted","Data":"be6040ea342b3e9583cc5326be49a0f09e1b299d6eeb2f67aec9d294d77790aa"} Feb 28 09:20:18 crc kubenswrapper[4687]: E0228 09:20:18.824701 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838\\\"\"" pod="openstack/cinder-db-sync-c9j72" podUID="3e5e221e-73c7-44a2-9af9-0feb60b412e0" Feb 28 09:20:19 crc kubenswrapper[4687]: W0228 09:20:19.098366 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a06887c_91c5_43bb_8631_53fac29e79b6.slice/crio-73128560e01e97d3de44cb4de5cead387621b152131260aab0f010990e438d7d WatchSource:0}: Error finding container 73128560e01e97d3de44cb4de5cead387621b152131260aab0f010990e438d7d: Status 404 returned error can't find the container with id 73128560e01e97d3de44cb4de5cead387621b152131260aab0f010990e438d7d Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.099661 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d58956cb6-f8plp"] Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.186944 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.485628 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-v6s24"] Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.509764 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-crzs5"] Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.602385 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5bd77ccf75-bqx56"] Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 
09:20:19.673047 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.766261 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7b9c5b669b-xd8lz"] Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.859370 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a0893a8-0386-4d6d-9476-c061c3fb5f3d","Type":"ContainerStarted","Data":"404b8da225a564a9322c0d472094c80332802f0e803b8ac973b8bb4bfb07d4de"} Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.880285 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b9587f844-jq5pd" event={"ID":"113841cd-f813-4ee0-93cf-2e3cfb43f6fc","Type":"ContainerStarted","Data":"96ae7ad7f435d1064ddf365c64d085cc7e92f2a9b17fb512d7c0421d37352b32"} Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.880338 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b9587f844-jq5pd" event={"ID":"113841cd-f813-4ee0-93cf-2e3cfb43f6fc","Type":"ContainerStarted","Data":"44ddd66f29600fe58d3ba97fd3da562aea7cb396d4ef94106187409639efb09f"} Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.885870 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-v6s24" event={"ID":"2c8490bf-32fb-4d04-974d-b2ca311f4b55","Type":"ContainerStarted","Data":"f0aa72df824eac21ad8f9e9e794e77b6b2f6f4fc4b7353cb04b062168c72aef5"} Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.899238 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d58956cb6-f8plp" event={"ID":"6a06887c-91c5-43bb-8631-53fac29e79b6","Type":"ContainerStarted","Data":"72f5b1d21b2565af1ff09d9cba487ca40b4971d91a32230255a8e098ffc62761"} Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.899271 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d58956cb6-f8plp" 
event={"ID":"6a06887c-91c5-43bb-8631-53fac29e79b6","Type":"ContainerStarted","Data":"73128560e01e97d3de44cb4de5cead387621b152131260aab0f010990e438d7d"} Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.904390 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cdfc7644-a187-4fe9-8067-fa474114c1a1","Type":"ContainerStarted","Data":"14390a5d012080e61bd12d53a120b0b67ebd8ff5a0deabc8deb56bd16cd47266"} Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.913569 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-b9587f844-jq5pd" podStartSLOduration=18.913548692 podStartE2EDuration="18.913548692s" podCreationTimestamp="2026-02-28 09:20:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:20:19.899118891 +0000 UTC m=+1011.589688228" watchObservedRunningTime="2026-02-28 09:20:19.913548692 +0000 UTC m=+1011.604118029" Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.916211 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9b795df4f-65xfj" event={"ID":"84c40408-c638-4bea-86d5-fb40a60b6975","Type":"ContainerStarted","Data":"b68cf1027bec3caa61756b0cafa9065fb6425e37e50d692bf7d2a9d913ffb111"} Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.916267 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9b795df4f-65xfj" event={"ID":"84c40408-c638-4bea-86d5-fb40a60b6975","Type":"ContainerStarted","Data":"61b4a041e894cea9908f5c65adf16323390c21a79756431a89555ce4ae9d050a"} Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.916438 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-9b795df4f-65xfj" podUID="84c40408-c638-4bea-86d5-fb40a60b6975" containerName="horizon-log" containerID="cri-o://61b4a041e894cea9908f5c65adf16323390c21a79756431a89555ce4ae9d050a" 
gracePeriod=30 Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.916530 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-9b795df4f-65xfj" podUID="84c40408-c638-4bea-86d5-fb40a60b6975" containerName="horizon" containerID="cri-o://b68cf1027bec3caa61756b0cafa9065fb6425e37e50d692bf7d2a9d913ffb111" gracePeriod=30 Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.937455 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5d58956cb6-f8plp" podStartSLOduration=18.937437905 podStartE2EDuration="18.937437905s" podCreationTimestamp="2026-02-28 09:20:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:20:19.929218675 +0000 UTC m=+1011.619788032" watchObservedRunningTime="2026-02-28 09:20:19.937437905 +0000 UTC m=+1011.628007242" Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.942181 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bd77ccf75-bqx56" event={"ID":"d655bdf4-33ab-45fa-b1e4-c37aede5609a","Type":"ContainerStarted","Data":"c148b3a169846cd28c277934cdaf8f10f03c20cf3471050301bc5785ff1c3420"} Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.954175 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"59ec19ad-b746-417b-a573-1b450746e794","Type":"ContainerStarted","Data":"8006049fb60c346daf716735c999f799a9d932c3d6ca58c1ef84b3b4687ca796"} Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.956157 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-crzs5" event={"ID":"455f8be2-a725-49fb-ba76-6f3e6c4cb34d","Type":"ContainerStarted","Data":"9ba3383f945d7b2472026c92c72afaf80f70e31989b5540c8090bf0e0bff0dcd"} Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.966189 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-94db9c8bf-6qj27" event={"ID":"27799696-4eb6-4ef9-9440-151a3929d699","Type":"ContainerStarted","Data":"712de4921f163318aadd23457ab174bf0c4fb55adf335f7d52d76cf15375c37e"} Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.966235 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-94db9c8bf-6qj27" event={"ID":"27799696-4eb6-4ef9-9440-151a3929d699","Type":"ContainerStarted","Data":"aafeb892e6f15626514b11a0c74fd9d9c18cc477eec929ba61e66e431cb01d28"} Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.966235 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-94db9c8bf-6qj27" podUID="27799696-4eb6-4ef9-9440-151a3929d699" containerName="horizon-log" containerID="cri-o://aafeb892e6f15626514b11a0c74fd9d9c18cc477eec929ba61e66e431cb01d28" gracePeriod=30 Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.966347 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-94db9c8bf-6qj27" podUID="27799696-4eb6-4ef9-9440-151a3929d699" containerName="horizon" containerID="cri-o://712de4921f163318aadd23457ab174bf0c4fb55adf335f7d52d76cf15375c37e" gracePeriod=30 Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.973707 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6774d8fcc9-lpttg" event={"ID":"76f683cb-cc38-4cdd-a0f0-1077410b1768","Type":"ContainerStarted","Data":"f325690874bfb899167706dea38c4f57ef91836e19d44224b585c114ace4221d"} Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.973746 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6774d8fcc9-lpttg" event={"ID":"76f683cb-cc38-4cdd-a0f0-1077410b1768","Type":"ContainerStarted","Data":"1c9ef7104fc110694f07caf4f711aeccd7d3058ee4396c11ab6e145a2805b318"} Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.974704 4687 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/horizon-6774d8fcc9-lpttg" podUID="76f683cb-cc38-4cdd-a0f0-1077410b1768" containerName="horizon-log" containerID="cri-o://1c9ef7104fc110694f07caf4f711aeccd7d3058ee4396c11ab6e145a2805b318" gracePeriod=30 Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.974796 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6774d8fcc9-lpttg" podUID="76f683cb-cc38-4cdd-a0f0-1077410b1768" containerName="horizon" containerID="cri-o://f325690874bfb899167706dea38c4f57ef91836e19d44224b585c114ace4221d" gracePeriod=30 Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.982295 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537840-2wf4q" event={"ID":"694d7626-7d52-4f55-a8c3-79feaec0e5e2","Type":"ContainerStarted","Data":"55cff91936e24c603712a05134d1af3e9e2eab28a8a118594290f9969c5201d8"} Feb 28 09:20:19 crc kubenswrapper[4687]: I0228 09:20:19.987094 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mcfl6" event={"ID":"ef1fa0a3-ab49-4807-a503-3a51a2b70e26","Type":"ContainerStarted","Data":"d9577ae41f570588fc2e1468a12933ca2e321fe2a49078f3670e73d6dc1d2931"} Feb 28 09:20:20 crc kubenswrapper[4687]: I0228 09:20:20.026784 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-94db9c8bf-6qj27" podStartSLOduration=3.838849804 podStartE2EDuration="28.026764426s" podCreationTimestamp="2026-02-28 09:19:52 +0000 UTC" firstStartedPulling="2026-02-28 09:19:54.031471845 +0000 UTC m=+985.722041182" lastFinishedPulling="2026-02-28 09:20:18.219386467 +0000 UTC m=+1009.909955804" observedRunningTime="2026-02-28 09:20:20.013710773 +0000 UTC m=+1011.704280110" watchObservedRunningTime="2026-02-28 09:20:20.026764426 +0000 UTC m=+1011.717333763" Feb 28 09:20:20 crc kubenswrapper[4687]: I0228 09:20:20.032525 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-9b795df4f-65xfj" 
podStartSLOduration=3.939179532 podStartE2EDuration="24.032509191s" podCreationTimestamp="2026-02-28 09:19:56 +0000 UTC" firstStartedPulling="2026-02-28 09:19:56.991101587 +0000 UTC m=+988.681670924" lastFinishedPulling="2026-02-28 09:20:17.084431246 +0000 UTC m=+1008.775000583" observedRunningTime="2026-02-28 09:20:19.964740757 +0000 UTC m=+1011.655310094" watchObservedRunningTime="2026-02-28 09:20:20.032509191 +0000 UTC m=+1011.723078528" Feb 28 09:20:20 crc kubenswrapper[4687]: I0228 09:20:20.045520 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29537840-2wf4q" podStartSLOduration=19.0930011 podStartE2EDuration="20.045503712s" podCreationTimestamp="2026-02-28 09:20:00 +0000 UTC" firstStartedPulling="2026-02-28 09:20:18.219319711 +0000 UTC m=+1009.909889048" lastFinishedPulling="2026-02-28 09:20:19.171822323 +0000 UTC m=+1010.862391660" observedRunningTime="2026-02-28 09:20:20.034291303 +0000 UTC m=+1011.724860629" watchObservedRunningTime="2026-02-28 09:20:20.045503712 +0000 UTC m=+1011.736073049" Feb 28 09:20:20 crc kubenswrapper[4687]: I0228 09:20:20.057157 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-mcfl6" podStartSLOduration=4.240799045 podStartE2EDuration="27.057142393s" podCreationTimestamp="2026-02-28 09:19:53 +0000 UTC" firstStartedPulling="2026-02-28 09:19:54.315303223 +0000 UTC m=+986.005872561" lastFinishedPulling="2026-02-28 09:20:17.131646572 +0000 UTC m=+1008.822215909" observedRunningTime="2026-02-28 09:20:20.046326851 +0000 UTC m=+1011.736896198" watchObservedRunningTime="2026-02-28 09:20:20.057142393 +0000 UTC m=+1011.747711729" Feb 28 09:20:20 crc kubenswrapper[4687]: I0228 09:20:20.073250 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6774d8fcc9-lpttg" podStartSLOduration=3.209756682 podStartE2EDuration="28.073231674s" podCreationTimestamp="2026-02-28 09:19:52 +0000 UTC" 
firstStartedPulling="2026-02-28 09:19:53.625458316 +0000 UTC m=+985.316027654" lastFinishedPulling="2026-02-28 09:20:18.488933308 +0000 UTC m=+1010.179502646" observedRunningTime="2026-02-28 09:20:20.07092176 +0000 UTC m=+1011.761491097" watchObservedRunningTime="2026-02-28 09:20:20.073231674 +0000 UTC m=+1011.763801012" Feb 28 09:20:21 crc kubenswrapper[4687]: I0228 09:20:21.003187 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-v6s24" event={"ID":"2c8490bf-32fb-4d04-974d-b2ca311f4b55","Type":"ContainerStarted","Data":"1b40e63fdc515073d911fb94bcf36a2b6d45554c85eea334cfe3d4e5db74cfbc"} Feb 28 09:20:21 crc kubenswrapper[4687]: I0228 09:20:21.006296 4687 generic.go:334] "Generic (PLEG): container finished" podID="694d7626-7d52-4f55-a8c3-79feaec0e5e2" containerID="55cff91936e24c603712a05134d1af3e9e2eab28a8a118594290f9969c5201d8" exitCode=0 Feb 28 09:20:21 crc kubenswrapper[4687]: I0228 09:20:21.006355 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537840-2wf4q" event={"ID":"694d7626-7d52-4f55-a8c3-79feaec0e5e2","Type":"ContainerDied","Data":"55cff91936e24c603712a05134d1af3e9e2eab28a8a118594290f9969c5201d8"} Feb 28 09:20:21 crc kubenswrapper[4687]: I0228 09:20:21.012248 4687 generic.go:334] "Generic (PLEG): container finished" podID="455f8be2-a725-49fb-ba76-6f3e6c4cb34d" containerID="626a1ffb5f3b5a18bb0918cd939d9fa5bb373a80a7f610e786ac81445a3c7d64" exitCode=0 Feb 28 09:20:21 crc kubenswrapper[4687]: I0228 09:20:21.012299 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-crzs5" event={"ID":"455f8be2-a725-49fb-ba76-6f3e6c4cb34d","Type":"ContainerDied","Data":"626a1ffb5f3b5a18bb0918cd939d9fa5bb373a80a7f610e786ac81445a3c7d64"} Feb 28 09:20:21 crc kubenswrapper[4687]: I0228 09:20:21.032220 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-v6s24" podStartSLOduration=12.03220644 
podStartE2EDuration="12.03220644s" podCreationTimestamp="2026-02-28 09:20:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:20:21.016226043 +0000 UTC m=+1012.706795390" watchObservedRunningTime="2026-02-28 09:20:21.03220644 +0000 UTC m=+1012.722775766" Feb 28 09:20:21 crc kubenswrapper[4687]: I0228 09:20:21.048711 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d58956cb6-f8plp" event={"ID":"6a06887c-91c5-43bb-8631-53fac29e79b6","Type":"ContainerStarted","Data":"57eba8c8848cfdc58b9d231bc4a845a3aef1d76384a7fc2e2fb3b3a4dcffe324"} Feb 28 09:20:21 crc kubenswrapper[4687]: I0228 09:20:21.090318 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cdfc7644-a187-4fe9-8067-fa474114c1a1","Type":"ContainerStarted","Data":"18848427d0dbbc8a8ada0f9975ef90eeed3cc2e0c27b19992c9f3cf0afc1647c"} Feb 28 09:20:21 crc kubenswrapper[4687]: I0228 09:20:21.094117 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bd77ccf75-bqx56" event={"ID":"d655bdf4-33ab-45fa-b1e4-c37aede5609a","Type":"ContainerStarted","Data":"1f47f176744fd7232de0f9faea595a9e3333827c6923ad75f5f60d0995f4502e"} Feb 28 09:20:21 crc kubenswrapper[4687]: I0228 09:20:21.094276 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bd77ccf75-bqx56" event={"ID":"d655bdf4-33ab-45fa-b1e4-c37aede5609a","Type":"ContainerStarted","Data":"51b219e86f3b0d6b4919b070002226d15fce4b8fe16494e79bab096be1e39e20"} Feb 28 09:20:21 crc kubenswrapper[4687]: I0228 09:20:21.095562 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:21 crc kubenswrapper[4687]: I0228 09:20:21.098889 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b9c5b669b-xd8lz" 
event={"ID":"0859ec96-842c-472a-be1b-f29c8f1df2d9","Type":"ContainerStarted","Data":"8c0f0bab64ff709f237761dab2e575643a7140ce428e1242ca10ffd15bd720ce"} Feb 28 09:20:21 crc kubenswrapper[4687]: I0228 09:20:21.098992 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b9c5b669b-xd8lz" event={"ID":"0859ec96-842c-472a-be1b-f29c8f1df2d9","Type":"ContainerStarted","Data":"03244fa16f2b84c19f33830379f405964557d73d6134657ac158003cd9026866"} Feb 28 09:20:21 crc kubenswrapper[4687]: I0228 09:20:21.099099 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b9c5b669b-xd8lz" event={"ID":"0859ec96-842c-472a-be1b-f29c8f1df2d9","Type":"ContainerStarted","Data":"b95d74220f77d6ba675759f720999213e1d9f6762c8feec6b655b77a35bd9d13"} Feb 28 09:20:21 crc kubenswrapper[4687]: I0228 09:20:21.100426 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7b9c5b669b-xd8lz" Feb 28 09:20:21 crc kubenswrapper[4687]: I0228 09:20:21.110371 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"59ec19ad-b746-417b-a573-1b450746e794","Type":"ContainerStarted","Data":"028844f2d4127d97e4dcbbf0a6c2f4aa6f538feb591e1cd7ad283e048ad0153f"} Feb 28 09:20:21 crc kubenswrapper[4687]: I0228 09:20:21.137534 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5bd77ccf75-bqx56" podStartSLOduration=9.137519659 podStartE2EDuration="9.137519659s" podCreationTimestamp="2026-02-28 09:20:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:20:21.119693992 +0000 UTC m=+1012.810263339" watchObservedRunningTime="2026-02-28 09:20:21.137519659 +0000 UTC m=+1012.828088996" Feb 28 09:20:21 crc kubenswrapper[4687]: I0228 09:20:21.155603 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-7b9c5b669b-xd8lz" podStartSLOduration=11.15559091 podStartE2EDuration="11.15559091s" podCreationTimestamp="2026-02-28 09:20:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:20:21.136969294 +0000 UTC m=+1012.827538632" watchObservedRunningTime="2026-02-28 09:20:21.15559091 +0000 UTC m=+1012.846160246" Feb 28 09:20:22 crc kubenswrapper[4687]: I0228 09:20:22.077288 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:20:22 crc kubenswrapper[4687]: I0228 09:20:22.077610 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:20:22 crc kubenswrapper[4687]: I0228 09:20:22.123073 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cdfc7644-a187-4fe9-8067-fa474114c1a1","Type":"ContainerStarted","Data":"b2671320ae659644d88f9255139ef23295ecac63a870898b1adfa50fddbad460"} Feb 28 09:20:22 crc kubenswrapper[4687]: I0228 09:20:22.128331 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mcfl6" event={"ID":"ef1fa0a3-ab49-4807-a503-3a51a2b70e26","Type":"ContainerDied","Data":"d9577ae41f570588fc2e1468a12933ca2e321fe2a49078f3670e73d6dc1d2931"} Feb 28 09:20:22 crc kubenswrapper[4687]: I0228 09:20:22.128247 4687 generic.go:334] "Generic (PLEG): container finished" podID="ef1fa0a3-ab49-4807-a503-3a51a2b70e26" containerID="d9577ae41f570588fc2e1468a12933ca2e321fe2a49078f3670e73d6dc1d2931" exitCode=0 Feb 28 09:20:22 crc kubenswrapper[4687]: I0228 09:20:22.130102 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"59ec19ad-b746-417b-a573-1b450746e794","Type":"ContainerStarted","Data":"8bd1539f05f84dff93650ce81fe1fb27a301643199250c07815c3f641b7b68d3"} Feb 28 09:20:22 crc 
kubenswrapper[4687]: I0228 09:20:22.136750 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-crzs5" event={"ID":"455f8be2-a725-49fb-ba76-6f3e6c4cb34d","Type":"ContainerStarted","Data":"99d7de2b7db74ea8113bfc0922f6805ecb6418596566a8ad9d8acf61d9569ffd"} Feb 28 09:20:22 crc kubenswrapper[4687]: I0228 09:20:22.136782 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7859c7799c-crzs5" Feb 28 09:20:22 crc kubenswrapper[4687]: I0228 09:20:22.168815 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=14.168802396 podStartE2EDuration="14.168802396s" podCreationTimestamp="2026-02-28 09:20:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:20:22.148183986 +0000 UTC m=+1013.838753323" watchObservedRunningTime="2026-02-28 09:20:22.168802396 +0000 UTC m=+1013.859371734" Feb 28 09:20:22 crc kubenswrapper[4687]: I0228 09:20:22.184312 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-b9587f844-jq5pd" Feb 28 09:20:22 crc kubenswrapper[4687]: I0228 09:20:22.185396 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-b9587f844-jq5pd" Feb 28 09:20:22 crc kubenswrapper[4687]: I0228 09:20:22.201235 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=13.20122569 podStartE2EDuration="13.20122569s" podCreationTimestamp="2026-02-28 09:20:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:20:22.18529173 +0000 UTC m=+1013.875861068" watchObservedRunningTime="2026-02-28 09:20:22.20122569 +0000 UTC m=+1013.891795017" Feb 28 09:20:22 crc 
kubenswrapper[4687]: I0228 09:20:22.232122 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7859c7799c-crzs5" podStartSLOduration=13.232097306 podStartE2EDuration="13.232097306s" podCreationTimestamp="2026-02-28 09:20:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:20:22.224388386 +0000 UTC m=+1013.914957733" watchObservedRunningTime="2026-02-28 09:20:22.232097306 +0000 UTC m=+1013.922666643" Feb 28 09:20:22 crc kubenswrapper[4687]: I0228 09:20:22.658471 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537840-2wf4q" Feb 28 09:20:22 crc kubenswrapper[4687]: I0228 09:20:22.744740 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl42m\" (UniqueName: \"kubernetes.io/projected/694d7626-7d52-4f55-a8c3-79feaec0e5e2-kube-api-access-fl42m\") pod \"694d7626-7d52-4f55-a8c3-79feaec0e5e2\" (UID: \"694d7626-7d52-4f55-a8c3-79feaec0e5e2\") " Feb 28 09:20:22 crc kubenswrapper[4687]: I0228 09:20:22.752797 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/694d7626-7d52-4f55-a8c3-79feaec0e5e2-kube-api-access-fl42m" (OuterVolumeSpecName: "kube-api-access-fl42m") pod "694d7626-7d52-4f55-a8c3-79feaec0e5e2" (UID: "694d7626-7d52-4f55-a8c3-79feaec0e5e2"). InnerVolumeSpecName "kube-api-access-fl42m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:20:22 crc kubenswrapper[4687]: I0228 09:20:22.846338 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl42m\" (UniqueName: \"kubernetes.io/projected/694d7626-7d52-4f55-a8c3-79feaec0e5e2-kube-api-access-fl42m\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:23 crc kubenswrapper[4687]: I0228 09:20:23.036546 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6774d8fcc9-lpttg" Feb 28 09:20:23 crc kubenswrapper[4687]: I0228 09:20:23.153947 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537840-2wf4q" Feb 28 09:20:23 crc kubenswrapper[4687]: I0228 09:20:23.154216 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537840-2wf4q" event={"ID":"694d7626-7d52-4f55-a8c3-79feaec0e5e2","Type":"ContainerDied","Data":"df24fc2f26be7adfe59cf6c0dc1ed98329fe165675aded440a7c3ce41b821f45"} Feb 28 09:20:23 crc kubenswrapper[4687]: I0228 09:20:23.154284 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df24fc2f26be7adfe59cf6c0dc1ed98329fe165675aded440a7c3ce41b821f45" Feb 28 09:20:23 crc kubenswrapper[4687]: I0228 09:20:23.361050 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-94db9c8bf-6qj27" Feb 28 09:20:23 crc kubenswrapper[4687]: I0228 09:20:23.715871 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537834-rb4jt"] Feb 28 09:20:23 crc kubenswrapper[4687]: I0228 09:20:23.720815 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537834-rb4jt"] Feb 28 09:20:24 crc kubenswrapper[4687]: I0228 09:20:24.169927 4687 generic.go:334] "Generic (PLEG): container finished" podID="2c8490bf-32fb-4d04-974d-b2ca311f4b55" 
containerID="1b40e63fdc515073d911fb94bcf36a2b6d45554c85eea334cfe3d4e5db74cfbc" exitCode=0 Feb 28 09:20:24 crc kubenswrapper[4687]: I0228 09:20:24.170010 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-v6s24" event={"ID":"2c8490bf-32fb-4d04-974d-b2ca311f4b55","Type":"ContainerDied","Data":"1b40e63fdc515073d911fb94bcf36a2b6d45554c85eea334cfe3d4e5db74cfbc"} Feb 28 09:20:24 crc kubenswrapper[4687]: I0228 09:20:24.666619 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="290aa17f-f371-42cf-875b-7166fc432dd2" path="/var/lib/kubelet/pods/290aa17f-f371-42cf-875b-7166fc432dd2/volumes" Feb 28 09:20:26 crc kubenswrapper[4687]: I0228 09:20:26.409617 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-9b795df4f-65xfj" Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.181319 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mcfl6" Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.190869 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-logs\") pod \"ef1fa0a3-ab49-4807-a503-3a51a2b70e26\" (UID: \"ef1fa0a3-ab49-4807-a503-3a51a2b70e26\") " Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.191172 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-scripts\") pod \"ef1fa0a3-ab49-4807-a503-3a51a2b70e26\" (UID: \"ef1fa0a3-ab49-4807-a503-3a51a2b70e26\") " Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.191193 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-config-data\") pod \"ef1fa0a3-ab49-4807-a503-3a51a2b70e26\" (UID: 
\"ef1fa0a3-ab49-4807-a503-3a51a2b70e26\") " Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.191228 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-combined-ca-bundle\") pod \"ef1fa0a3-ab49-4807-a503-3a51a2b70e26\" (UID: \"ef1fa0a3-ab49-4807-a503-3a51a2b70e26\") " Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.191259 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnpfp\" (UniqueName: \"kubernetes.io/projected/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-kube-api-access-rnpfp\") pod \"ef1fa0a3-ab49-4807-a503-3a51a2b70e26\" (UID: \"ef1fa0a3-ab49-4807-a503-3a51a2b70e26\") " Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.193830 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-logs" (OuterVolumeSpecName: "logs") pod "ef1fa0a3-ab49-4807-a503-3a51a2b70e26" (UID: "ef1fa0a3-ab49-4807-a503-3a51a2b70e26"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.196033 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-scripts" (OuterVolumeSpecName: "scripts") pod "ef1fa0a3-ab49-4807-a503-3a51a2b70e26" (UID: "ef1fa0a3-ab49-4807-a503-3a51a2b70e26"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.202932 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-v6s24" event={"ID":"2c8490bf-32fb-4d04-974d-b2ca311f4b55","Type":"ContainerDied","Data":"f0aa72df824eac21ad8f9e9e794e77b6b2f6f4fc4b7353cb04b062168c72aef5"} Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.202988 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0aa72df824eac21ad8f9e9e794e77b6b2f6f4fc4b7353cb04b062168c72aef5" Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.207191 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-kube-api-access-rnpfp" (OuterVolumeSpecName: "kube-api-access-rnpfp") pod "ef1fa0a3-ab49-4807-a503-3a51a2b70e26" (UID: "ef1fa0a3-ab49-4807-a503-3a51a2b70e26"). InnerVolumeSpecName "kube-api-access-rnpfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.214166 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mcfl6" event={"ID":"ef1fa0a3-ab49-4807-a503-3a51a2b70e26","Type":"ContainerDied","Data":"af5ef71ae62dd4647fb3bf15e5cca54837665fb3a538a4d8c05c340eaa099ec8"} Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.214198 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af5ef71ae62dd4647fb3bf15e5cca54837665fb3a538a4d8c05c340eaa099ec8" Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.214251 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-mcfl6" Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.235866 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-config-data" (OuterVolumeSpecName: "config-data") pod "ef1fa0a3-ab49-4807-a503-3a51a2b70e26" (UID: "ef1fa0a3-ab49-4807-a503-3a51a2b70e26"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.248287 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef1fa0a3-ab49-4807-a503-3a51a2b70e26" (UID: "ef1fa0a3-ab49-4807-a503-3a51a2b70e26"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.293896 4687 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-logs\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.293954 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.293968 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.293980 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:27 crc 
kubenswrapper[4687]: I0228 09:20:27.293996 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnpfp\" (UniqueName: \"kubernetes.io/projected/ef1fa0a3-ab49-4807-a503-3a51a2b70e26-kube-api-access-rnpfp\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.321609 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-v6s24" Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.395242 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-scripts\") pod \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\" (UID: \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\") " Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.396005 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6dt5\" (UniqueName: \"kubernetes.io/projected/2c8490bf-32fb-4d04-974d-b2ca311f4b55-kube-api-access-l6dt5\") pod \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\" (UID: \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\") " Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.396219 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-config-data\") pod \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\" (UID: \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\") " Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.396379 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-combined-ca-bundle\") pod \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\" (UID: \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\") " Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.396422 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-credential-keys\") pod \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\" (UID: \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\") " Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.396478 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-fernet-keys\") pod \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\" (UID: \"2c8490bf-32fb-4d04-974d-b2ca311f4b55\") " Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.400098 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c8490bf-32fb-4d04-974d-b2ca311f4b55-kube-api-access-l6dt5" (OuterVolumeSpecName: "kube-api-access-l6dt5") pod "2c8490bf-32fb-4d04-974d-b2ca311f4b55" (UID: "2c8490bf-32fb-4d04-974d-b2ca311f4b55"). InnerVolumeSpecName "kube-api-access-l6dt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.401152 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-scripts" (OuterVolumeSpecName: "scripts") pod "2c8490bf-32fb-4d04-974d-b2ca311f4b55" (UID: "2c8490bf-32fb-4d04-974d-b2ca311f4b55"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.402667 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2c8490bf-32fb-4d04-974d-b2ca311f4b55" (UID: "2c8490bf-32fb-4d04-974d-b2ca311f4b55"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.408203 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2c8490bf-32fb-4d04-974d-b2ca311f4b55" (UID: "2c8490bf-32fb-4d04-974d-b2ca311f4b55"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.432198 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c8490bf-32fb-4d04-974d-b2ca311f4b55" (UID: "2c8490bf-32fb-4d04-974d-b2ca311f4b55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.442079 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-config-data" (OuterVolumeSpecName: "config-data") pod "2c8490bf-32fb-4d04-974d-b2ca311f4b55" (UID: "2c8490bf-32fb-4d04-974d-b2ca311f4b55"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.499059 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.499093 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6dt5\" (UniqueName: \"kubernetes.io/projected/2c8490bf-32fb-4d04-974d-b2ca311f4b55-kube-api-access-l6dt5\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.499105 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.499118 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.499127 4687 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:27 crc kubenswrapper[4687]: I0228 09:20:27.499135 4687 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c8490bf-32fb-4d04-974d-b2ca311f4b55-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.257175 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-v6s24" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.257386 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a0893a8-0386-4d6d-9476-c061c3fb5f3d","Type":"ContainerStarted","Data":"ec2211cc8159f7654685062ebd6bbc5d493f2f317474a1dfca1a6c26b052d1b7"} Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.280469 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6f6cfc745b-qklfm"] Feb 28 09:20:28 crc kubenswrapper[4687]: E0228 09:20:28.280844 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="878defc9-19d4-48ce-92c3-9b0976de28d2" containerName="dnsmasq-dns" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.280862 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="878defc9-19d4-48ce-92c3-9b0976de28d2" containerName="dnsmasq-dns" Feb 28 09:20:28 crc kubenswrapper[4687]: E0228 09:20:28.280875 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="878defc9-19d4-48ce-92c3-9b0976de28d2" containerName="init" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.280881 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="878defc9-19d4-48ce-92c3-9b0976de28d2" containerName="init" Feb 28 09:20:28 crc kubenswrapper[4687]: E0228 09:20:28.280903 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="694d7626-7d52-4f55-a8c3-79feaec0e5e2" containerName="oc" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.280908 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="694d7626-7d52-4f55-a8c3-79feaec0e5e2" containerName="oc" Feb 28 09:20:28 crc kubenswrapper[4687]: E0228 09:20:28.280933 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8490bf-32fb-4d04-974d-b2ca311f4b55" containerName="keystone-bootstrap" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.280939 4687 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2c8490bf-32fb-4d04-974d-b2ca311f4b55" containerName="keystone-bootstrap" Feb 28 09:20:28 crc kubenswrapper[4687]: E0228 09:20:28.280947 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef1fa0a3-ab49-4807-a503-3a51a2b70e26" containerName="placement-db-sync" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.280954 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef1fa0a3-ab49-4807-a503-3a51a2b70e26" containerName="placement-db-sync" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.281132 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="694d7626-7d52-4f55-a8c3-79feaec0e5e2" containerName="oc" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.281153 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c8490bf-32fb-4d04-974d-b2ca311f4b55" containerName="keystone-bootstrap" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.281171 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="878defc9-19d4-48ce-92c3-9b0976de28d2" containerName="dnsmasq-dns" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.281191 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef1fa0a3-ab49-4807-a503-3a51a2b70e26" containerName="placement-db-sync" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.282324 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.288880 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.289043 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.289188 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.289298 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-q9nmm" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.289512 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.321248 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-scripts\") pod \"placement-6f6cfc745b-qklfm\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.321383 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42eabfaf-28a5-4986-ad88-a93859225843-logs\") pod \"placement-6f6cfc745b-qklfm\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.321770 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-combined-ca-bundle\") pod 
\"placement-6f6cfc745b-qklfm\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.321953 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-config-data\") pod \"placement-6f6cfc745b-qklfm\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.322054 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-internal-tls-certs\") pod \"placement-6f6cfc745b-qklfm\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.322569 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-public-tls-certs\") pod \"placement-6f6cfc745b-qklfm\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.322713 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m42qm\" (UniqueName: \"kubernetes.io/projected/42eabfaf-28a5-4986-ad88-a93859225843-kube-api-access-m42qm\") pod \"placement-6f6cfc745b-qklfm\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.335127 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6f6cfc745b-qklfm"] Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.425483 4687 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-public-tls-certs\") pod \"placement-6f6cfc745b-qklfm\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.425556 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m42qm\" (UniqueName: \"kubernetes.io/projected/42eabfaf-28a5-4986-ad88-a93859225843-kube-api-access-m42qm\") pod \"placement-6f6cfc745b-qklfm\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.425598 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-scripts\") pod \"placement-6f6cfc745b-qklfm\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.425617 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42eabfaf-28a5-4986-ad88-a93859225843-logs\") pod \"placement-6f6cfc745b-qklfm\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.425658 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-combined-ca-bundle\") pod \"placement-6f6cfc745b-qklfm\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.425692 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-config-data\") pod \"placement-6f6cfc745b-qklfm\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.425721 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-internal-tls-certs\") pod \"placement-6f6cfc745b-qklfm\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.429641 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42eabfaf-28a5-4986-ad88-a93859225843-logs\") pod \"placement-6f6cfc745b-qklfm\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.431905 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-internal-tls-certs\") pod \"placement-6f6cfc745b-qklfm\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.434818 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-combined-ca-bundle\") pod \"placement-6f6cfc745b-qklfm\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.436970 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-scripts\") pod \"placement-6f6cfc745b-qklfm\" (UID: 
\"42eabfaf-28a5-4986-ad88-a93859225843\") " pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.437417 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-public-tls-certs\") pod \"placement-6f6cfc745b-qklfm\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.444980 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m42qm\" (UniqueName: \"kubernetes.io/projected/42eabfaf-28a5-4986-ad88-a93859225843-kube-api-access-m42qm\") pod \"placement-6f6cfc745b-qklfm\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.453654 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-config-data\") pod \"placement-6f6cfc745b-qklfm\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.488114 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-8685d6f5dd-ndtlf"] Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.489488 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.495332 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.495492 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.495987 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.496230 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.496379 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.496508 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-29qml" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.513549 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8685d6f5dd-ndtlf"] Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.527954 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fcd0fba-03d4-4584-b991-7f719e04b98d-scripts\") pod \"keystone-8685d6f5dd-ndtlf\" (UID: \"8fcd0fba-03d4-4584-b991-7f719e04b98d\") " pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.528056 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8fcd0fba-03d4-4584-b991-7f719e04b98d-credential-keys\") pod \"keystone-8685d6f5dd-ndtlf\" (UID: \"8fcd0fba-03d4-4584-b991-7f719e04b98d\") " 
pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.528129 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fcd0fba-03d4-4584-b991-7f719e04b98d-config-data\") pod \"keystone-8685d6f5dd-ndtlf\" (UID: \"8fcd0fba-03d4-4584-b991-7f719e04b98d\") " pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.528325 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fcd0fba-03d4-4584-b991-7f719e04b98d-internal-tls-certs\") pod \"keystone-8685d6f5dd-ndtlf\" (UID: \"8fcd0fba-03d4-4584-b991-7f719e04b98d\") " pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.528362 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fcd0fba-03d4-4584-b991-7f719e04b98d-combined-ca-bundle\") pod \"keystone-8685d6f5dd-ndtlf\" (UID: \"8fcd0fba-03d4-4584-b991-7f719e04b98d\") " pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.528428 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fcd0fba-03d4-4584-b991-7f719e04b98d-public-tls-certs\") pod \"keystone-8685d6f5dd-ndtlf\" (UID: \"8fcd0fba-03d4-4584-b991-7f719e04b98d\") " pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.528586 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vj8q\" (UniqueName: \"kubernetes.io/projected/8fcd0fba-03d4-4584-b991-7f719e04b98d-kube-api-access-7vj8q\") pod \"keystone-8685d6f5dd-ndtlf\" (UID: 
\"8fcd0fba-03d4-4584-b991-7f719e04b98d\") " pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.528653 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8fcd0fba-03d4-4584-b991-7f719e04b98d-fernet-keys\") pod \"keystone-8685d6f5dd-ndtlf\" (UID: \"8fcd0fba-03d4-4584-b991-7f719e04b98d\") " pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.611012 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.633123 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8fcd0fba-03d4-4584-b991-7f719e04b98d-credential-keys\") pod \"keystone-8685d6f5dd-ndtlf\" (UID: \"8fcd0fba-03d4-4584-b991-7f719e04b98d\") " pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.633205 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fcd0fba-03d4-4584-b991-7f719e04b98d-config-data\") pod \"keystone-8685d6f5dd-ndtlf\" (UID: \"8fcd0fba-03d4-4584-b991-7f719e04b98d\") " pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.633288 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fcd0fba-03d4-4584-b991-7f719e04b98d-internal-tls-certs\") pod \"keystone-8685d6f5dd-ndtlf\" (UID: \"8fcd0fba-03d4-4584-b991-7f719e04b98d\") " pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.633315 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8fcd0fba-03d4-4584-b991-7f719e04b98d-combined-ca-bundle\") pod \"keystone-8685d6f5dd-ndtlf\" (UID: \"8fcd0fba-03d4-4584-b991-7f719e04b98d\") " pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.633348 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fcd0fba-03d4-4584-b991-7f719e04b98d-public-tls-certs\") pod \"keystone-8685d6f5dd-ndtlf\" (UID: \"8fcd0fba-03d4-4584-b991-7f719e04b98d\") " pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.633440 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vj8q\" (UniqueName: \"kubernetes.io/projected/8fcd0fba-03d4-4584-b991-7f719e04b98d-kube-api-access-7vj8q\") pod \"keystone-8685d6f5dd-ndtlf\" (UID: \"8fcd0fba-03d4-4584-b991-7f719e04b98d\") " pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.633490 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8fcd0fba-03d4-4584-b991-7f719e04b98d-fernet-keys\") pod \"keystone-8685d6f5dd-ndtlf\" (UID: \"8fcd0fba-03d4-4584-b991-7f719e04b98d\") " pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.633523 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fcd0fba-03d4-4584-b991-7f719e04b98d-scripts\") pod \"keystone-8685d6f5dd-ndtlf\" (UID: \"8fcd0fba-03d4-4584-b991-7f719e04b98d\") " pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.635460 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.636429 4687 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.636507 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.636597 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.636672 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.641221 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fcd0fba-03d4-4584-b991-7f719e04b98d-combined-ca-bundle\") pod \"keystone-8685d6f5dd-ndtlf\" (UID: \"8fcd0fba-03d4-4584-b991-7f719e04b98d\") " pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.664938 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fcd0fba-03d4-4584-b991-7f719e04b98d-config-data\") pod \"keystone-8685d6f5dd-ndtlf\" (UID: \"8fcd0fba-03d4-4584-b991-7f719e04b98d\") " pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.665835 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fcd0fba-03d4-4584-b991-7f719e04b98d-public-tls-certs\") pod \"keystone-8685d6f5dd-ndtlf\" (UID: \"8fcd0fba-03d4-4584-b991-7f719e04b98d\") " pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.669535 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8fcd0fba-03d4-4584-b991-7f719e04b98d-fernet-keys\") pod \"keystone-8685d6f5dd-ndtlf\" (UID: \"8fcd0fba-03d4-4584-b991-7f719e04b98d\") " 
pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.669703 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fcd0fba-03d4-4584-b991-7f719e04b98d-internal-tls-certs\") pod \"keystone-8685d6f5dd-ndtlf\" (UID: \"8fcd0fba-03d4-4584-b991-7f719e04b98d\") " pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.670416 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8fcd0fba-03d4-4584-b991-7f719e04b98d-credential-keys\") pod \"keystone-8685d6f5dd-ndtlf\" (UID: \"8fcd0fba-03d4-4584-b991-7f719e04b98d\") " pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.670656 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fcd0fba-03d4-4584-b991-7f719e04b98d-scripts\") pod \"keystone-8685d6f5dd-ndtlf\" (UID: \"8fcd0fba-03d4-4584-b991-7f719e04b98d\") " pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.678663 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vj8q\" (UniqueName: \"kubernetes.io/projected/8fcd0fba-03d4-4584-b991-7f719e04b98d-kube-api-access-7vj8q\") pod \"keystone-8685d6f5dd-ndtlf\" (UID: \"8fcd0fba-03d4-4584-b991-7f719e04b98d\") " pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.841435 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-29qml" Feb 28 09:20:28 crc kubenswrapper[4687]: I0228 09:20:28.844663 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:29 crc kubenswrapper[4687]: I0228 09:20:29.155639 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6f6cfc745b-qklfm"] Feb 28 09:20:29 crc kubenswrapper[4687]: I0228 09:20:29.268249 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f6cfc745b-qklfm" event={"ID":"42eabfaf-28a5-4986-ad88-a93859225843","Type":"ContainerStarted","Data":"bb33055feadf667676e8af073e956194ad6a1d84e4fbdb30ed526dd9a37339b7"} Feb 28 09:20:29 crc kubenswrapper[4687]: I0228 09:20:29.330224 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8685d6f5dd-ndtlf"] Feb 28 09:20:29 crc kubenswrapper[4687]: I0228 09:20:29.393578 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 28 09:20:29 crc kubenswrapper[4687]: I0228 09:20:29.393617 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 28 09:20:29 crc kubenswrapper[4687]: I0228 09:20:29.516287 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 28 09:20:29 crc kubenswrapper[4687]: I0228 09:20:29.523373 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 28 09:20:30 crc kubenswrapper[4687]: I0228 09:20:30.307955 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mvkm8" event={"ID":"21a39679-80b0-4a80-ad64-fe3707c2a9f0","Type":"ContainerStarted","Data":"eafa8aa189b8dc205e3d2d50f710b6dc7d3538bb809e3c71ada4b453013fb30d"} Feb 28 09:20:30 crc kubenswrapper[4687]: I0228 09:20:30.347895 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8685d6f5dd-ndtlf" 
event={"ID":"8fcd0fba-03d4-4584-b991-7f719e04b98d","Type":"ContainerStarted","Data":"951a6cde0af1ed810afdbca74b66bf1e361421226cabd3d338eab03f2aec3be8"} Feb 28 09:20:30 crc kubenswrapper[4687]: I0228 09:20:30.347962 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8685d6f5dd-ndtlf" event={"ID":"8fcd0fba-03d4-4584-b991-7f719e04b98d","Type":"ContainerStarted","Data":"8c29f02153dd5d1be0242eb552663248e205afeeab67c8e5b801bfbae0cddd58"} Feb 28 09:20:30 crc kubenswrapper[4687]: I0228 09:20:30.348768 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:20:30 crc kubenswrapper[4687]: I0228 09:20:30.353361 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-mvkm8" podStartSLOduration=3.140166995 podStartE2EDuration="38.35333992s" podCreationTimestamp="2026-02-28 09:19:52 +0000 UTC" firstStartedPulling="2026-02-28 09:19:54.032380924 +0000 UTC m=+985.722950262" lastFinishedPulling="2026-02-28 09:20:29.24555385 +0000 UTC m=+1020.936123187" observedRunningTime="2026-02-28 09:20:30.33677872 +0000 UTC m=+1022.027348057" watchObservedRunningTime="2026-02-28 09:20:30.35333992 +0000 UTC m=+1022.043909257" Feb 28 09:20:30 crc kubenswrapper[4687]: I0228 09:20:30.373120 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f6cfc745b-qklfm" event={"ID":"42eabfaf-28a5-4986-ad88-a93859225843","Type":"ContainerStarted","Data":"b0bf191f33628fb62188c40a46bede2b789b37bdee9687877e8f5cdd31171f62"} Feb 28 09:20:30 crc kubenswrapper[4687]: I0228 09:20:30.373167 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f6cfc745b-qklfm" event={"ID":"42eabfaf-28a5-4986-ad88-a93859225843","Type":"ContainerStarted","Data":"f7f614e24e6b8cbbf14ae24850ac1463ccbf43398ae08c6a403bea74d91d5729"} Feb 28 09:20:30 crc kubenswrapper[4687]: I0228 09:20:30.373187 4687 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 28 09:20:30 crc kubenswrapper[4687]: I0228 09:20:30.374083 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 28 09:20:30 crc kubenswrapper[4687]: I0228 09:20:30.379729 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-8685d6f5dd-ndtlf" podStartSLOduration=2.379710479 podStartE2EDuration="2.379710479s" podCreationTimestamp="2026-02-28 09:20:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:20:30.378212222 +0000 UTC m=+1022.068781559" watchObservedRunningTime="2026-02-28 09:20:30.379710479 +0000 UTC m=+1022.070279816" Feb 28 09:20:30 crc kubenswrapper[4687]: I0228 09:20:30.391270 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 28 09:20:30 crc kubenswrapper[4687]: I0228 09:20:30.391320 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 28 09:20:30 crc kubenswrapper[4687]: I0228 09:20:30.412533 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6f6cfc745b-qklfm" podStartSLOduration=2.412512505 podStartE2EDuration="2.412512505s" podCreationTimestamp="2026-02-28 09:20:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:20:30.396582173 +0000 UTC m=+1022.087151520" watchObservedRunningTime="2026-02-28 09:20:30.412512505 +0000 UTC m=+1022.103081843" Feb 28 09:20:30 crc kubenswrapper[4687]: I0228 09:20:30.432291 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 28 09:20:30 crc kubenswrapper[4687]: I0228 
09:20:30.449243 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 28 09:20:30 crc kubenswrapper[4687]: I0228 09:20:30.476218 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7859c7799c-crzs5" Feb 28 09:20:30 crc kubenswrapper[4687]: I0228 09:20:30.560874 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-bw8wq"] Feb 28 09:20:30 crc kubenswrapper[4687]: I0228 09:20:30.561125 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" podUID="baee8d66-1152-499a-9e04-1c58353c4651" containerName="dnsmasq-dns" containerID="cri-o://505a526a74270e82d4537e98fc87928deb20bfb6ac074cbc2be0f77b8932a155" gracePeriod=10 Feb 28 09:20:31 crc kubenswrapper[4687]: I0228 09:20:31.393376 4687 generic.go:334] "Generic (PLEG): container finished" podID="baee8d66-1152-499a-9e04-1c58353c4651" containerID="505a526a74270e82d4537e98fc87928deb20bfb6ac074cbc2be0f77b8932a155" exitCode=0 Feb 28 09:20:31 crc kubenswrapper[4687]: I0228 09:20:31.393458 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" event={"ID":"baee8d66-1152-499a-9e04-1c58353c4651","Type":"ContainerDied","Data":"505a526a74270e82d4537e98fc87928deb20bfb6ac074cbc2be0f77b8932a155"} Feb 28 09:20:31 crc kubenswrapper[4687]: I0228 09:20:31.394448 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 28 09:20:31 crc kubenswrapper[4687]: I0228 09:20:31.394477 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:20:31 crc kubenswrapper[4687]: I0228 09:20:31.394489 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:20:31 crc kubenswrapper[4687]: I0228 09:20:31.394498 
4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 28 09:20:31 crc kubenswrapper[4687]: I0228 09:20:31.558384 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" Feb 28 09:20:31 crc kubenswrapper[4687]: I0228 09:20:31.624805 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-ovsdbserver-sb\") pod \"baee8d66-1152-499a-9e04-1c58353c4651\" (UID: \"baee8d66-1152-499a-9e04-1c58353c4651\") " Feb 28 09:20:31 crc kubenswrapper[4687]: I0228 09:20:31.624977 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9f75\" (UniqueName: \"kubernetes.io/projected/baee8d66-1152-499a-9e04-1c58353c4651-kube-api-access-q9f75\") pod \"baee8d66-1152-499a-9e04-1c58353c4651\" (UID: \"baee8d66-1152-499a-9e04-1c58353c4651\") " Feb 28 09:20:31 crc kubenswrapper[4687]: I0228 09:20:31.625009 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-dns-svc\") pod \"baee8d66-1152-499a-9e04-1c58353c4651\" (UID: \"baee8d66-1152-499a-9e04-1c58353c4651\") " Feb 28 09:20:31 crc kubenswrapper[4687]: I0228 09:20:31.625065 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-config\") pod \"baee8d66-1152-499a-9e04-1c58353c4651\" (UID: \"baee8d66-1152-499a-9e04-1c58353c4651\") " Feb 28 09:20:31 crc kubenswrapper[4687]: I0228 09:20:31.625083 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-ovsdbserver-nb\") pod 
\"baee8d66-1152-499a-9e04-1c58353c4651\" (UID: \"baee8d66-1152-499a-9e04-1c58353c4651\") " Feb 28 09:20:31 crc kubenswrapper[4687]: I0228 09:20:31.625146 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-dns-swift-storage-0\") pod \"baee8d66-1152-499a-9e04-1c58353c4651\" (UID: \"baee8d66-1152-499a-9e04-1c58353c4651\") " Feb 28 09:20:31 crc kubenswrapper[4687]: I0228 09:20:31.637542 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baee8d66-1152-499a-9e04-1c58353c4651-kube-api-access-q9f75" (OuterVolumeSpecName: "kube-api-access-q9f75") pod "baee8d66-1152-499a-9e04-1c58353c4651" (UID: "baee8d66-1152-499a-9e04-1c58353c4651"). InnerVolumeSpecName "kube-api-access-q9f75". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:20:31 crc kubenswrapper[4687]: I0228 09:20:31.708512 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "baee8d66-1152-499a-9e04-1c58353c4651" (UID: "baee8d66-1152-499a-9e04-1c58353c4651"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:20:31 crc kubenswrapper[4687]: I0228 09:20:31.725499 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "baee8d66-1152-499a-9e04-1c58353c4651" (UID: "baee8d66-1152-499a-9e04-1c58353c4651"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:20:31 crc kubenswrapper[4687]: I0228 09:20:31.729824 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-config" (OuterVolumeSpecName: "config") pod "baee8d66-1152-499a-9e04-1c58353c4651" (UID: "baee8d66-1152-499a-9e04-1c58353c4651"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:20:31 crc kubenswrapper[4687]: I0228 09:20:31.730226 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9f75\" (UniqueName: \"kubernetes.io/projected/baee8d66-1152-499a-9e04-1c58353c4651-kube-api-access-q9f75\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:31 crc kubenswrapper[4687]: I0228 09:20:31.730254 4687 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:31 crc kubenswrapper[4687]: I0228 09:20:31.730263 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:31 crc kubenswrapper[4687]: I0228 09:20:31.730271 4687 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:31 crc kubenswrapper[4687]: I0228 09:20:31.738799 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "baee8d66-1152-499a-9e04-1c58353c4651" (UID: "baee8d66-1152-499a-9e04-1c58353c4651"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:20:31 crc kubenswrapper[4687]: I0228 09:20:31.765477 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "baee8d66-1152-499a-9e04-1c58353c4651" (UID: "baee8d66-1152-499a-9e04-1c58353c4651"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:20:31 crc kubenswrapper[4687]: I0228 09:20:31.832737 4687 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:31 crc kubenswrapper[4687]: I0228 09:20:31.832763 4687 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/baee8d66-1152-499a-9e04-1c58353c4651-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:32 crc kubenswrapper[4687]: I0228 09:20:32.081118 4687 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5d58956cb6-f8plp" podUID="6a06887c-91c5-43bb-8631-53fac29e79b6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Feb 28 09:20:32 crc kubenswrapper[4687]: I0228 09:20:32.187322 4687 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-b9587f844-jq5pd" podUID="113841cd-f813-4ee0-93cf-2e3cfb43f6fc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Feb 28 09:20:32 crc kubenswrapper[4687]: I0228 09:20:32.409716 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-c9j72" 
event={"ID":"3e5e221e-73c7-44a2-9af9-0feb60b412e0","Type":"ContainerStarted","Data":"36d8793b5506960f0edd95fae453cc7431c4d82d7aee4458db381af12f245d6b"} Feb 28 09:20:32 crc kubenswrapper[4687]: I0228 09:20:32.422111 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" event={"ID":"baee8d66-1152-499a-9e04-1c58353c4651","Type":"ContainerDied","Data":"02c68db94f383f8a0fd02f1ccdb13fd58afdf08441908981d131fe71a1cc72d7"} Feb 28 09:20:32 crc kubenswrapper[4687]: I0228 09:20:32.422143 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-bw8wq" Feb 28 09:20:32 crc kubenswrapper[4687]: I0228 09:20:32.422399 4687 scope.go:117] "RemoveContainer" containerID="505a526a74270e82d4537e98fc87928deb20bfb6ac074cbc2be0f77b8932a155" Feb 28 09:20:32 crc kubenswrapper[4687]: I0228 09:20:32.431959 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-c9j72" podStartSLOduration=4.049988297 podStartE2EDuration="40.431950281s" podCreationTimestamp="2026-02-28 09:19:52 +0000 UTC" firstStartedPulling="2026-02-28 09:19:53.883629181 +0000 UTC m=+985.574198519" lastFinishedPulling="2026-02-28 09:20:30.265591166 +0000 UTC m=+1021.956160503" observedRunningTime="2026-02-28 09:20:32.427086794 +0000 UTC m=+1024.117656131" watchObservedRunningTime="2026-02-28 09:20:32.431950281 +0000 UTC m=+1024.122519618" Feb 28 09:20:32 crc kubenswrapper[4687]: I0228 09:20:32.434659 4687 generic.go:334] "Generic (PLEG): container finished" podID="21a39679-80b0-4a80-ad64-fe3707c2a9f0" containerID="eafa8aa189b8dc205e3d2d50f710b6dc7d3538bb809e3c71ada4b453013fb30d" exitCode=0 Feb 28 09:20:32 crc kubenswrapper[4687]: I0228 09:20:32.434749 4687 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 28 09:20:32 crc kubenswrapper[4687]: I0228 09:20:32.434759 4687 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 28 
09:20:32 crc kubenswrapper[4687]: I0228 09:20:32.434800 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mvkm8" event={"ID":"21a39679-80b0-4a80-ad64-fe3707c2a9f0","Type":"ContainerDied","Data":"eafa8aa189b8dc205e3d2d50f710b6dc7d3538bb809e3c71ada4b453013fb30d"} Feb 28 09:20:32 crc kubenswrapper[4687]: I0228 09:20:32.449864 4687 scope.go:117] "RemoveContainer" containerID="cf77963f4f0d9b79eeebfb11e4dee3a877ce783225e125d32a8afd5304756876" Feb 28 09:20:32 crc kubenswrapper[4687]: I0228 09:20:32.473893 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-bw8wq"] Feb 28 09:20:32 crc kubenswrapper[4687]: I0228 09:20:32.481260 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-bw8wq"] Feb 28 09:20:32 crc kubenswrapper[4687]: I0228 09:20:32.666430 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baee8d66-1152-499a-9e04-1c58353c4651" path="/var/lib/kubelet/pods/baee8d66-1152-499a-9e04-1c58353c4651/volumes" Feb 28 09:20:32 crc kubenswrapper[4687]: I0228 09:20:32.720799 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 28 09:20:32 crc kubenswrapper[4687]: I0228 09:20:32.722002 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.348466 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d6696bd5b-vf747"] Feb 28 09:20:33 crc kubenswrapper[4687]: E0228 09:20:33.348959 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baee8d66-1152-499a-9e04-1c58353c4651" containerName="init" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.349043 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="baee8d66-1152-499a-9e04-1c58353c4651" containerName="init" Feb 28 09:20:33 crc kubenswrapper[4687]: E0228 
09:20:33.349120 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baee8d66-1152-499a-9e04-1c58353c4651" containerName="dnsmasq-dns" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.349171 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="baee8d66-1152-499a-9e04-1c58353c4651" containerName="dnsmasq-dns" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.349378 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="baee8d66-1152-499a-9e04-1c58353c4651" containerName="dnsmasq-dns" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.350315 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.379064 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d6696bd5b-vf747"] Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.416592 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.455743 4687 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.473964 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j52qg\" (UniqueName: \"kubernetes.io/projected/0aa8b593-6c7b-438e-b95c-3f39081df0ea-kube-api-access-j52qg\") pod \"placement-d6696bd5b-vf747\" (UID: \"0aa8b593-6c7b-438e-b95c-3f39081df0ea\") " pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.474043 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aa8b593-6c7b-438e-b95c-3f39081df0ea-config-data\") pod \"placement-d6696bd5b-vf747\" (UID: \"0aa8b593-6c7b-438e-b95c-3f39081df0ea\") " 
pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.474084 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa8b593-6c7b-438e-b95c-3f39081df0ea-internal-tls-certs\") pod \"placement-d6696bd5b-vf747\" (UID: \"0aa8b593-6c7b-438e-b95c-3f39081df0ea\") " pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.474141 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0aa8b593-6c7b-438e-b95c-3f39081df0ea-scripts\") pod \"placement-d6696bd5b-vf747\" (UID: \"0aa8b593-6c7b-438e-b95c-3f39081df0ea\") " pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.474162 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aa8b593-6c7b-438e-b95c-3f39081df0ea-combined-ca-bundle\") pod \"placement-d6696bd5b-vf747\" (UID: \"0aa8b593-6c7b-438e-b95c-3f39081df0ea\") " pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.474181 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0aa8b593-6c7b-438e-b95c-3f39081df0ea-logs\") pod \"placement-d6696bd5b-vf747\" (UID: \"0aa8b593-6c7b-438e-b95c-3f39081df0ea\") " pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.474226 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa8b593-6c7b-438e-b95c-3f39081df0ea-public-tls-certs\") pod \"placement-d6696bd5b-vf747\" (UID: \"0aa8b593-6c7b-438e-b95c-3f39081df0ea\") " 
pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.477644 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.576260 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aa8b593-6c7b-438e-b95c-3f39081df0ea-config-data\") pod \"placement-d6696bd5b-vf747\" (UID: \"0aa8b593-6c7b-438e-b95c-3f39081df0ea\") " pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.576576 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa8b593-6c7b-438e-b95c-3f39081df0ea-internal-tls-certs\") pod \"placement-d6696bd5b-vf747\" (UID: \"0aa8b593-6c7b-438e-b95c-3f39081df0ea\") " pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.576730 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0aa8b593-6c7b-438e-b95c-3f39081df0ea-scripts\") pod \"placement-d6696bd5b-vf747\" (UID: \"0aa8b593-6c7b-438e-b95c-3f39081df0ea\") " pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.576763 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aa8b593-6c7b-438e-b95c-3f39081df0ea-combined-ca-bundle\") pod \"placement-d6696bd5b-vf747\" (UID: \"0aa8b593-6c7b-438e-b95c-3f39081df0ea\") " pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.576780 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0aa8b593-6c7b-438e-b95c-3f39081df0ea-logs\") pod 
\"placement-d6696bd5b-vf747\" (UID: \"0aa8b593-6c7b-438e-b95c-3f39081df0ea\") " pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.576894 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa8b593-6c7b-438e-b95c-3f39081df0ea-public-tls-certs\") pod \"placement-d6696bd5b-vf747\" (UID: \"0aa8b593-6c7b-438e-b95c-3f39081df0ea\") " pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.576985 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j52qg\" (UniqueName: \"kubernetes.io/projected/0aa8b593-6c7b-438e-b95c-3f39081df0ea-kube-api-access-j52qg\") pod \"placement-d6696bd5b-vf747\" (UID: \"0aa8b593-6c7b-438e-b95c-3f39081df0ea\") " pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.578986 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0aa8b593-6c7b-438e-b95c-3f39081df0ea-logs\") pod \"placement-d6696bd5b-vf747\" (UID: \"0aa8b593-6c7b-438e-b95c-3f39081df0ea\") " pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.583415 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa8b593-6c7b-438e-b95c-3f39081df0ea-internal-tls-certs\") pod \"placement-d6696bd5b-vf747\" (UID: \"0aa8b593-6c7b-438e-b95c-3f39081df0ea\") " pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.583554 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aa8b593-6c7b-438e-b95c-3f39081df0ea-combined-ca-bundle\") pod \"placement-d6696bd5b-vf747\" (UID: \"0aa8b593-6c7b-438e-b95c-3f39081df0ea\") " 
pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.589364 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0aa8b593-6c7b-438e-b95c-3f39081df0ea-scripts\") pod \"placement-d6696bd5b-vf747\" (UID: \"0aa8b593-6c7b-438e-b95c-3f39081df0ea\") " pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.594534 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aa8b593-6c7b-438e-b95c-3f39081df0ea-config-data\") pod \"placement-d6696bd5b-vf747\" (UID: \"0aa8b593-6c7b-438e-b95c-3f39081df0ea\") " pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.605844 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa8b593-6c7b-438e-b95c-3f39081df0ea-public-tls-certs\") pod \"placement-d6696bd5b-vf747\" (UID: \"0aa8b593-6c7b-438e-b95c-3f39081df0ea\") " pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.610472 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j52qg\" (UniqueName: \"kubernetes.io/projected/0aa8b593-6c7b-438e-b95c-3f39081df0ea-kube-api-access-j52qg\") pod \"placement-d6696bd5b-vf747\" (UID: \"0aa8b593-6c7b-438e-b95c-3f39081df0ea\") " pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.666574 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.927382 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-mvkm8" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.986090 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk7lz\" (UniqueName: \"kubernetes.io/projected/21a39679-80b0-4a80-ad64-fe3707c2a9f0-kube-api-access-fk7lz\") pod \"21a39679-80b0-4a80-ad64-fe3707c2a9f0\" (UID: \"21a39679-80b0-4a80-ad64-fe3707c2a9f0\") " Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.986178 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a39679-80b0-4a80-ad64-fe3707c2a9f0-combined-ca-bundle\") pod \"21a39679-80b0-4a80-ad64-fe3707c2a9f0\" (UID: \"21a39679-80b0-4a80-ad64-fe3707c2a9f0\") " Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.986240 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/21a39679-80b0-4a80-ad64-fe3707c2a9f0-db-sync-config-data\") pod \"21a39679-80b0-4a80-ad64-fe3707c2a9f0\" (UID: \"21a39679-80b0-4a80-ad64-fe3707c2a9f0\") " Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.995246 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21a39679-80b0-4a80-ad64-fe3707c2a9f0-kube-api-access-fk7lz" (OuterVolumeSpecName: "kube-api-access-fk7lz") pod "21a39679-80b0-4a80-ad64-fe3707c2a9f0" (UID: "21a39679-80b0-4a80-ad64-fe3707c2a9f0"). InnerVolumeSpecName "kube-api-access-fk7lz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:20:33 crc kubenswrapper[4687]: I0228 09:20:33.998534 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21a39679-80b0-4a80-ad64-fe3707c2a9f0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "21a39679-80b0-4a80-ad64-fe3707c2a9f0" (UID: "21a39679-80b0-4a80-ad64-fe3707c2a9f0"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.019152 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21a39679-80b0-4a80-ad64-fe3707c2a9f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21a39679-80b0-4a80-ad64-fe3707c2a9f0" (UID: "21a39679-80b0-4a80-ad64-fe3707c2a9f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.089397 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a39679-80b0-4a80-ad64-fe3707c2a9f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.089435 4687 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/21a39679-80b0-4a80-ad64-fe3707c2a9f0-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.089445 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fk7lz\" (UniqueName: \"kubernetes.io/projected/21a39679-80b0-4a80-ad64-fe3707c2a9f0-kube-api-access-fk7lz\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.226101 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d6696bd5b-vf747"] Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.465713 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mvkm8" event={"ID":"21a39679-80b0-4a80-ad64-fe3707c2a9f0","Type":"ContainerDied","Data":"f65eb8b258925b98bf73a7698d47c9bd18a95ae83349919a8e0172d797496532"} Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.465770 4687 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="f65eb8b258925b98bf73a7698d47c9bd18a95ae83349919a8e0172d797496532" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.465977 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mvkm8" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.708627 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6586f4f898-ssm26"] Feb 28 09:20:34 crc kubenswrapper[4687]: E0228 09:20:34.718090 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21a39679-80b0-4a80-ad64-fe3707c2a9f0" containerName="barbican-db-sync" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.718123 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="21a39679-80b0-4a80-ad64-fe3707c2a9f0" containerName="barbican-db-sync" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.718444 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="21a39679-80b0-4a80-ad64-fe3707c2a9f0" containerName="barbican-db-sync" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.719503 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6586f4f898-ssm26" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.724830 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6586f4f898-ssm26"] Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.735824 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.736104 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.736243 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-bmtlj" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.742884 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5f58cc8c7c-dxx99"] Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.744677 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5f58cc8c7c-dxx99" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.746192 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.759533 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5f58cc8c7c-dxx99"] Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.817081 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-tzjqk"] Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.818620 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8449d68f4f-tzjqk" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.868073 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-tzjqk"] Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.903932 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blv56\" (UniqueName: \"kubernetes.io/projected/5ec85d56-f00e-4193-b4eb-ae0d43a13ffa-kube-api-access-blv56\") pod \"barbican-worker-5f58cc8c7c-dxx99\" (UID: \"5ec85d56-f00e-4193-b4eb-ae0d43a13ffa\") " pod="openstack/barbican-worker-5f58cc8c7c-dxx99" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.904499 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86nlq\" (UniqueName: \"kubernetes.io/projected/cc722f81-31b0-44eb-8206-4256e2ae12f0-kube-api-access-86nlq\") pod \"barbican-keystone-listener-6586f4f898-ssm26\" (UID: \"cc722f81-31b0-44eb-8206-4256e2ae12f0\") " pod="openstack/barbican-keystone-listener-6586f4f898-ssm26" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.904631 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc722f81-31b0-44eb-8206-4256e2ae12f0-config-data\") pod \"barbican-keystone-listener-6586f4f898-ssm26\" (UID: \"cc722f81-31b0-44eb-8206-4256e2ae12f0\") " pod="openstack/barbican-keystone-listener-6586f4f898-ssm26" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.904764 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc722f81-31b0-44eb-8206-4256e2ae12f0-logs\") pod \"barbican-keystone-listener-6586f4f898-ssm26\" (UID: \"cc722f81-31b0-44eb-8206-4256e2ae12f0\") " pod="openstack/barbican-keystone-listener-6586f4f898-ssm26" Feb 28 09:20:34 crc 
kubenswrapper[4687]: I0228 09:20:34.904830 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec85d56-f00e-4193-b4eb-ae0d43a13ffa-config-data\") pod \"barbican-worker-5f58cc8c7c-dxx99\" (UID: \"5ec85d56-f00e-4193-b4eb-ae0d43a13ffa\") " pod="openstack/barbican-worker-5f58cc8c7c-dxx99" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.904886 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc722f81-31b0-44eb-8206-4256e2ae12f0-combined-ca-bundle\") pod \"barbican-keystone-listener-6586f4f898-ssm26\" (UID: \"cc722f81-31b0-44eb-8206-4256e2ae12f0\") " pod="openstack/barbican-keystone-listener-6586f4f898-ssm26" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.904913 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ec85d56-f00e-4193-b4eb-ae0d43a13ffa-logs\") pod \"barbican-worker-5f58cc8c7c-dxx99\" (UID: \"5ec85d56-f00e-4193-b4eb-ae0d43a13ffa\") " pod="openstack/barbican-worker-5f58cc8c7c-dxx99" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.904988 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc722f81-31b0-44eb-8206-4256e2ae12f0-config-data-custom\") pod \"barbican-keystone-listener-6586f4f898-ssm26\" (UID: \"cc722f81-31b0-44eb-8206-4256e2ae12f0\") " pod="openstack/barbican-keystone-listener-6586f4f898-ssm26" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.905134 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec85d56-f00e-4193-b4eb-ae0d43a13ffa-combined-ca-bundle\") pod \"barbican-worker-5f58cc8c7c-dxx99\" (UID: 
\"5ec85d56-f00e-4193-b4eb-ae0d43a13ffa\") " pod="openstack/barbican-worker-5f58cc8c7c-dxx99" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.905181 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5ec85d56-f00e-4193-b4eb-ae0d43a13ffa-config-data-custom\") pod \"barbican-worker-5f58cc8c7c-dxx99\" (UID: \"5ec85d56-f00e-4193-b4eb-ae0d43a13ffa\") " pod="openstack/barbican-worker-5f58cc8c7c-dxx99" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.938116 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5d8d4bb8d-87zwm"] Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.957282 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d8d4bb8d-87zwm"] Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.957412 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5d8d4bb8d-87zwm" Feb 28 09:20:34 crc kubenswrapper[4687]: I0228 09:20:34.969136 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.015001 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86nlq\" (UniqueName: \"kubernetes.io/projected/cc722f81-31b0-44eb-8206-4256e2ae12f0-kube-api-access-86nlq\") pod \"barbican-keystone-listener-6586f4f898-ssm26\" (UID: \"cc722f81-31b0-44eb-8206-4256e2ae12f0\") " pod="openstack/barbican-keystone-listener-6586f4f898-ssm26" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.015276 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-config\") pod \"dnsmasq-dns-8449d68f4f-tzjqk\" (UID: \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\") " 
pod="openstack/dnsmasq-dns-8449d68f4f-tzjqk" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.015462 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc722f81-31b0-44eb-8206-4256e2ae12f0-config-data\") pod \"barbican-keystone-listener-6586f4f898-ssm26\" (UID: \"cc722f81-31b0-44eb-8206-4256e2ae12f0\") " pod="openstack/barbican-keystone-listener-6586f4f898-ssm26" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.015615 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-ovsdbserver-nb\") pod \"dnsmasq-dns-8449d68f4f-tzjqk\" (UID: \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\") " pod="openstack/dnsmasq-dns-8449d68f4f-tzjqk" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.015779 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc722f81-31b0-44eb-8206-4256e2ae12f0-logs\") pod \"barbican-keystone-listener-6586f4f898-ssm26\" (UID: \"cc722f81-31b0-44eb-8206-4256e2ae12f0\") " pod="openstack/barbican-keystone-listener-6586f4f898-ssm26" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.015881 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz676\" (UniqueName: \"kubernetes.io/projected/bad05ef2-b8b3-4844-a104-7bf24d1398b0-kube-api-access-xz676\") pod \"dnsmasq-dns-8449d68f4f-tzjqk\" (UID: \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\") " pod="openstack/dnsmasq-dns-8449d68f4f-tzjqk" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.016010 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec85d56-f00e-4193-b4eb-ae0d43a13ffa-config-data\") pod \"barbican-worker-5f58cc8c7c-dxx99\" (UID: 
\"5ec85d56-f00e-4193-b4eb-ae0d43a13ffa\") " pod="openstack/barbican-worker-5f58cc8c7c-dxx99" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.016125 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-dns-swift-storage-0\") pod \"dnsmasq-dns-8449d68f4f-tzjqk\" (UID: \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\") " pod="openstack/dnsmasq-dns-8449d68f4f-tzjqk" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.016236 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-dns-svc\") pod \"dnsmasq-dns-8449d68f4f-tzjqk\" (UID: \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\") " pod="openstack/dnsmasq-dns-8449d68f4f-tzjqk" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.016305 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc722f81-31b0-44eb-8206-4256e2ae12f0-combined-ca-bundle\") pod \"barbican-keystone-listener-6586f4f898-ssm26\" (UID: \"cc722f81-31b0-44eb-8206-4256e2ae12f0\") " pod="openstack/barbican-keystone-listener-6586f4f898-ssm26" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.016391 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ec85d56-f00e-4193-b4eb-ae0d43a13ffa-logs\") pod \"barbican-worker-5f58cc8c7c-dxx99\" (UID: \"5ec85d56-f00e-4193-b4eb-ae0d43a13ffa\") " pod="openstack/barbican-worker-5f58cc8c7c-dxx99" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.016491 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-ovsdbserver-sb\") pod 
\"dnsmasq-dns-8449d68f4f-tzjqk\" (UID: \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\") " pod="openstack/dnsmasq-dns-8449d68f4f-tzjqk" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.016613 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc722f81-31b0-44eb-8206-4256e2ae12f0-config-data-custom\") pod \"barbican-keystone-listener-6586f4f898-ssm26\" (UID: \"cc722f81-31b0-44eb-8206-4256e2ae12f0\") " pod="openstack/barbican-keystone-listener-6586f4f898-ssm26" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.016760 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec85d56-f00e-4193-b4eb-ae0d43a13ffa-combined-ca-bundle\") pod \"barbican-worker-5f58cc8c7c-dxx99\" (UID: \"5ec85d56-f00e-4193-b4eb-ae0d43a13ffa\") " pod="openstack/barbican-worker-5f58cc8c7c-dxx99" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.016842 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5ec85d56-f00e-4193-b4eb-ae0d43a13ffa-config-data-custom\") pod \"barbican-worker-5f58cc8c7c-dxx99\" (UID: \"5ec85d56-f00e-4193-b4eb-ae0d43a13ffa\") " pod="openstack/barbican-worker-5f58cc8c7c-dxx99" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.017008 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blv56\" (UniqueName: \"kubernetes.io/projected/5ec85d56-f00e-4193-b4eb-ae0d43a13ffa-kube-api-access-blv56\") pod \"barbican-worker-5f58cc8c7c-dxx99\" (UID: \"5ec85d56-f00e-4193-b4eb-ae0d43a13ffa\") " pod="openstack/barbican-worker-5f58cc8c7c-dxx99" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.023199 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ec85d56-f00e-4193-b4eb-ae0d43a13ffa-logs\") pod 
\"barbican-worker-5f58cc8c7c-dxx99\" (UID: \"5ec85d56-f00e-4193-b4eb-ae0d43a13ffa\") " pod="openstack/barbican-worker-5f58cc8c7c-dxx99" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.027617 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc722f81-31b0-44eb-8206-4256e2ae12f0-logs\") pod \"barbican-keystone-listener-6586f4f898-ssm26\" (UID: \"cc722f81-31b0-44eb-8206-4256e2ae12f0\") " pod="openstack/barbican-keystone-listener-6586f4f898-ssm26" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.035197 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86nlq\" (UniqueName: \"kubernetes.io/projected/cc722f81-31b0-44eb-8206-4256e2ae12f0-kube-api-access-86nlq\") pod \"barbican-keystone-listener-6586f4f898-ssm26\" (UID: \"cc722f81-31b0-44eb-8206-4256e2ae12f0\") " pod="openstack/barbican-keystone-listener-6586f4f898-ssm26" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.041131 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec85d56-f00e-4193-b4eb-ae0d43a13ffa-config-data\") pod \"barbican-worker-5f58cc8c7c-dxx99\" (UID: \"5ec85d56-f00e-4193-b4eb-ae0d43a13ffa\") " pod="openstack/barbican-worker-5f58cc8c7c-dxx99" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.041230 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5ec85d56-f00e-4193-b4eb-ae0d43a13ffa-config-data-custom\") pod \"barbican-worker-5f58cc8c7c-dxx99\" (UID: \"5ec85d56-f00e-4193-b4eb-ae0d43a13ffa\") " pod="openstack/barbican-worker-5f58cc8c7c-dxx99" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.041547 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blv56\" (UniqueName: \"kubernetes.io/projected/5ec85d56-f00e-4193-b4eb-ae0d43a13ffa-kube-api-access-blv56\") pod 
\"barbican-worker-5f58cc8c7c-dxx99\" (UID: \"5ec85d56-f00e-4193-b4eb-ae0d43a13ffa\") " pod="openstack/barbican-worker-5f58cc8c7c-dxx99" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.046437 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec85d56-f00e-4193-b4eb-ae0d43a13ffa-combined-ca-bundle\") pod \"barbican-worker-5f58cc8c7c-dxx99\" (UID: \"5ec85d56-f00e-4193-b4eb-ae0d43a13ffa\") " pod="openstack/barbican-worker-5f58cc8c7c-dxx99" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.046988 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc722f81-31b0-44eb-8206-4256e2ae12f0-config-data\") pod \"barbican-keystone-listener-6586f4f898-ssm26\" (UID: \"cc722f81-31b0-44eb-8206-4256e2ae12f0\") " pod="openstack/barbican-keystone-listener-6586f4f898-ssm26" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.055537 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc722f81-31b0-44eb-8206-4256e2ae12f0-config-data-custom\") pod \"barbican-keystone-listener-6586f4f898-ssm26\" (UID: \"cc722f81-31b0-44eb-8206-4256e2ae12f0\") " pod="openstack/barbican-keystone-listener-6586f4f898-ssm26" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.062597 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5f58cc8c7c-dxx99" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.076537 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc722f81-31b0-44eb-8206-4256e2ae12f0-combined-ca-bundle\") pod \"barbican-keystone-listener-6586f4f898-ssm26\" (UID: \"cc722f81-31b0-44eb-8206-4256e2ae12f0\") " pod="openstack/barbican-keystone-listener-6586f4f898-ssm26" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.132331 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa88f1b2-477c-461c-a044-88fd35c31231-combined-ca-bundle\") pod \"barbican-api-5d8d4bb8d-87zwm\" (UID: \"aa88f1b2-477c-461c-a044-88fd35c31231\") " pod="openstack/barbican-api-5d8d4bb8d-87zwm" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.132582 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-config\") pod \"dnsmasq-dns-8449d68f4f-tzjqk\" (UID: \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\") " pod="openstack/dnsmasq-dns-8449d68f4f-tzjqk" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.132607 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa88f1b2-477c-461c-a044-88fd35c31231-config-data\") pod \"barbican-api-5d8d4bb8d-87zwm\" (UID: \"aa88f1b2-477c-461c-a044-88fd35c31231\") " pod="openstack/barbican-api-5d8d4bb8d-87zwm" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.132629 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa88f1b2-477c-461c-a044-88fd35c31231-logs\") pod \"barbican-api-5d8d4bb8d-87zwm\" (UID: 
\"aa88f1b2-477c-461c-a044-88fd35c31231\") " pod="openstack/barbican-api-5d8d4bb8d-87zwm" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.132718 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-ovsdbserver-nb\") pod \"dnsmasq-dns-8449d68f4f-tzjqk\" (UID: \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\") " pod="openstack/dnsmasq-dns-8449d68f4f-tzjqk" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.132757 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa88f1b2-477c-461c-a044-88fd35c31231-config-data-custom\") pod \"barbican-api-5d8d4bb8d-87zwm\" (UID: \"aa88f1b2-477c-461c-a044-88fd35c31231\") " pod="openstack/barbican-api-5d8d4bb8d-87zwm" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.132828 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz676\" (UniqueName: \"kubernetes.io/projected/bad05ef2-b8b3-4844-a104-7bf24d1398b0-kube-api-access-xz676\") pod \"dnsmasq-dns-8449d68f4f-tzjqk\" (UID: \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\") " pod="openstack/dnsmasq-dns-8449d68f4f-tzjqk" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.132853 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-dns-swift-storage-0\") pod \"dnsmasq-dns-8449d68f4f-tzjqk\" (UID: \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\") " pod="openstack/dnsmasq-dns-8449d68f4f-tzjqk" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.132890 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-dns-svc\") pod \"dnsmasq-dns-8449d68f4f-tzjqk\" (UID: 
\"bad05ef2-b8b3-4844-a104-7bf24d1398b0\") " pod="openstack/dnsmasq-dns-8449d68f4f-tzjqk" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.132927 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-ovsdbserver-sb\") pod \"dnsmasq-dns-8449d68f4f-tzjqk\" (UID: \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\") " pod="openstack/dnsmasq-dns-8449d68f4f-tzjqk" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.132953 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxdn6\" (UniqueName: \"kubernetes.io/projected/aa88f1b2-477c-461c-a044-88fd35c31231-kube-api-access-lxdn6\") pod \"barbican-api-5d8d4bb8d-87zwm\" (UID: \"aa88f1b2-477c-461c-a044-88fd35c31231\") " pod="openstack/barbican-api-5d8d4bb8d-87zwm" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.134178 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-config\") pod \"dnsmasq-dns-8449d68f4f-tzjqk\" (UID: \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\") " pod="openstack/dnsmasq-dns-8449d68f4f-tzjqk" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.134802 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-ovsdbserver-nb\") pod \"dnsmasq-dns-8449d68f4f-tzjqk\" (UID: \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\") " pod="openstack/dnsmasq-dns-8449d68f4f-tzjqk" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.135256 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-dns-svc\") pod \"dnsmasq-dns-8449d68f4f-tzjqk\" (UID: \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\") " 
pod="openstack/dnsmasq-dns-8449d68f4f-tzjqk" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.135740 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-ovsdbserver-sb\") pod \"dnsmasq-dns-8449d68f4f-tzjqk\" (UID: \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\") " pod="openstack/dnsmasq-dns-8449d68f4f-tzjqk" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.136676 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-dns-swift-storage-0\") pod \"dnsmasq-dns-8449d68f4f-tzjqk\" (UID: \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\") " pod="openstack/dnsmasq-dns-8449d68f4f-tzjqk" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.157782 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz676\" (UniqueName: \"kubernetes.io/projected/bad05ef2-b8b3-4844-a104-7bf24d1398b0-kube-api-access-xz676\") pod \"dnsmasq-dns-8449d68f4f-tzjqk\" (UID: \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\") " pod="openstack/dnsmasq-dns-8449d68f4f-tzjqk" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.158566 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8449d68f4f-tzjqk" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.240716 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa88f1b2-477c-461c-a044-88fd35c31231-config-data\") pod \"barbican-api-5d8d4bb8d-87zwm\" (UID: \"aa88f1b2-477c-461c-a044-88fd35c31231\") " pod="openstack/barbican-api-5d8d4bb8d-87zwm" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.240780 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa88f1b2-477c-461c-a044-88fd35c31231-logs\") pod \"barbican-api-5d8d4bb8d-87zwm\" (UID: \"aa88f1b2-477c-461c-a044-88fd35c31231\") " pod="openstack/barbican-api-5d8d4bb8d-87zwm" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.240922 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa88f1b2-477c-461c-a044-88fd35c31231-config-data-custom\") pod \"barbican-api-5d8d4bb8d-87zwm\" (UID: \"aa88f1b2-477c-461c-a044-88fd35c31231\") " pod="openstack/barbican-api-5d8d4bb8d-87zwm" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.241593 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxdn6\" (UniqueName: \"kubernetes.io/projected/aa88f1b2-477c-461c-a044-88fd35c31231-kube-api-access-lxdn6\") pod \"barbican-api-5d8d4bb8d-87zwm\" (UID: \"aa88f1b2-477c-461c-a044-88fd35c31231\") " pod="openstack/barbican-api-5d8d4bb8d-87zwm" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.241663 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa88f1b2-477c-461c-a044-88fd35c31231-combined-ca-bundle\") pod \"barbican-api-5d8d4bb8d-87zwm\" (UID: \"aa88f1b2-477c-461c-a044-88fd35c31231\") " 
pod="openstack/barbican-api-5d8d4bb8d-87zwm" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.241755 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa88f1b2-477c-461c-a044-88fd35c31231-logs\") pod \"barbican-api-5d8d4bb8d-87zwm\" (UID: \"aa88f1b2-477c-461c-a044-88fd35c31231\") " pod="openstack/barbican-api-5d8d4bb8d-87zwm" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.248574 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa88f1b2-477c-461c-a044-88fd35c31231-config-data-custom\") pod \"barbican-api-5d8d4bb8d-87zwm\" (UID: \"aa88f1b2-477c-461c-a044-88fd35c31231\") " pod="openstack/barbican-api-5d8d4bb8d-87zwm" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.249506 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa88f1b2-477c-461c-a044-88fd35c31231-config-data\") pod \"barbican-api-5d8d4bb8d-87zwm\" (UID: \"aa88f1b2-477c-461c-a044-88fd35c31231\") " pod="openstack/barbican-api-5d8d4bb8d-87zwm" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.259252 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa88f1b2-477c-461c-a044-88fd35c31231-combined-ca-bundle\") pod \"barbican-api-5d8d4bb8d-87zwm\" (UID: \"aa88f1b2-477c-461c-a044-88fd35c31231\") " pod="openstack/barbican-api-5d8d4bb8d-87zwm" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.265452 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxdn6\" (UniqueName: \"kubernetes.io/projected/aa88f1b2-477c-461c-a044-88fd35c31231-kube-api-access-lxdn6\") pod \"barbican-api-5d8d4bb8d-87zwm\" (UID: \"aa88f1b2-477c-461c-a044-88fd35c31231\") " pod="openstack/barbican-api-5d8d4bb8d-87zwm" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 
09:20:35.315544 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5d8d4bb8d-87zwm" Feb 28 09:20:35 crc kubenswrapper[4687]: I0228 09:20:35.337265 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6586f4f898-ssm26" Feb 28 09:20:36 crc kubenswrapper[4687]: I0228 09:20:36.486573 4687 generic.go:334] "Generic (PLEG): container finished" podID="3e5e221e-73c7-44a2-9af9-0feb60b412e0" containerID="36d8793b5506960f0edd95fae453cc7431c4d82d7aee4458db381af12f245d6b" exitCode=0 Feb 28 09:20:36 crc kubenswrapper[4687]: I0228 09:20:36.486647 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-c9j72" event={"ID":"3e5e221e-73c7-44a2-9af9-0feb60b412e0","Type":"ContainerDied","Data":"36d8793b5506960f0edd95fae453cc7431c4d82d7aee4458db381af12f245d6b"} Feb 28 09:20:37 crc kubenswrapper[4687]: I0228 09:20:37.848184 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6f95b8bb44-tjzcn"] Feb 28 09:20:37 crc kubenswrapper[4687]: I0228 09:20:37.854788 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:37 crc kubenswrapper[4687]: I0228 09:20:37.860419 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 28 09:20:37 crc kubenswrapper[4687]: I0228 09:20:37.860632 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 28 09:20:37 crc kubenswrapper[4687]: I0228 09:20:37.869037 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f95b8bb44-tjzcn"] Feb 28 09:20:37 crc kubenswrapper[4687]: I0228 09:20:37.994113 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa58d12c-eed3-46e2-915f-c8383b8949fe-public-tls-certs\") pod \"barbican-api-6f95b8bb44-tjzcn\" (UID: \"fa58d12c-eed3-46e2-915f-c8383b8949fe\") " pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:37 crc kubenswrapper[4687]: I0228 09:20:37.994161 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa58d12c-eed3-46e2-915f-c8383b8949fe-config-data-custom\") pod \"barbican-api-6f95b8bb44-tjzcn\" (UID: \"fa58d12c-eed3-46e2-915f-c8383b8949fe\") " pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:37 crc kubenswrapper[4687]: I0228 09:20:37.994293 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa58d12c-eed3-46e2-915f-c8383b8949fe-internal-tls-certs\") pod \"barbican-api-6f95b8bb44-tjzcn\" (UID: \"fa58d12c-eed3-46e2-915f-c8383b8949fe\") " pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:37 crc kubenswrapper[4687]: I0228 09:20:37.994363 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/fa58d12c-eed3-46e2-915f-c8383b8949fe-logs\") pod \"barbican-api-6f95b8bb44-tjzcn\" (UID: \"fa58d12c-eed3-46e2-915f-c8383b8949fe\") " pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:37 crc kubenswrapper[4687]: I0228 09:20:37.994524 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l8vg\" (UniqueName: \"kubernetes.io/projected/fa58d12c-eed3-46e2-915f-c8383b8949fe-kube-api-access-4l8vg\") pod \"barbican-api-6f95b8bb44-tjzcn\" (UID: \"fa58d12c-eed3-46e2-915f-c8383b8949fe\") " pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:37 crc kubenswrapper[4687]: I0228 09:20:37.994641 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa58d12c-eed3-46e2-915f-c8383b8949fe-config-data\") pod \"barbican-api-6f95b8bb44-tjzcn\" (UID: \"fa58d12c-eed3-46e2-915f-c8383b8949fe\") " pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:37 crc kubenswrapper[4687]: I0228 09:20:37.994744 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa58d12c-eed3-46e2-915f-c8383b8949fe-combined-ca-bundle\") pod \"barbican-api-6f95b8bb44-tjzcn\" (UID: \"fa58d12c-eed3-46e2-915f-c8383b8949fe\") " pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.098081 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa58d12c-eed3-46e2-915f-c8383b8949fe-config-data-custom\") pod \"barbican-api-6f95b8bb44-tjzcn\" (UID: \"fa58d12c-eed3-46e2-915f-c8383b8949fe\") " pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.098990 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa58d12c-eed3-46e2-915f-c8383b8949fe-public-tls-certs\") pod \"barbican-api-6f95b8bb44-tjzcn\" (UID: \"fa58d12c-eed3-46e2-915f-c8383b8949fe\") " pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.099063 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa58d12c-eed3-46e2-915f-c8383b8949fe-internal-tls-certs\") pod \"barbican-api-6f95b8bb44-tjzcn\" (UID: \"fa58d12c-eed3-46e2-915f-c8383b8949fe\") " pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.099113 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa58d12c-eed3-46e2-915f-c8383b8949fe-logs\") pod \"barbican-api-6f95b8bb44-tjzcn\" (UID: \"fa58d12c-eed3-46e2-915f-c8383b8949fe\") " pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.099235 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l8vg\" (UniqueName: \"kubernetes.io/projected/fa58d12c-eed3-46e2-915f-c8383b8949fe-kube-api-access-4l8vg\") pod \"barbican-api-6f95b8bb44-tjzcn\" (UID: \"fa58d12c-eed3-46e2-915f-c8383b8949fe\") " pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.099321 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa58d12c-eed3-46e2-915f-c8383b8949fe-config-data\") pod \"barbican-api-6f95b8bb44-tjzcn\" (UID: \"fa58d12c-eed3-46e2-915f-c8383b8949fe\") " pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.099439 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fa58d12c-eed3-46e2-915f-c8383b8949fe-combined-ca-bundle\") pod \"barbican-api-6f95b8bb44-tjzcn\" (UID: \"fa58d12c-eed3-46e2-915f-c8383b8949fe\") " pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.102416 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa58d12c-eed3-46e2-915f-c8383b8949fe-logs\") pod \"barbican-api-6f95b8bb44-tjzcn\" (UID: \"fa58d12c-eed3-46e2-915f-c8383b8949fe\") " pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.106712 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa58d12c-eed3-46e2-915f-c8383b8949fe-config-data-custom\") pod \"barbican-api-6f95b8bb44-tjzcn\" (UID: \"fa58d12c-eed3-46e2-915f-c8383b8949fe\") " pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.110546 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa58d12c-eed3-46e2-915f-c8383b8949fe-combined-ca-bundle\") pod \"barbican-api-6f95b8bb44-tjzcn\" (UID: \"fa58d12c-eed3-46e2-915f-c8383b8949fe\") " pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.110632 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa58d12c-eed3-46e2-915f-c8383b8949fe-public-tls-certs\") pod \"barbican-api-6f95b8bb44-tjzcn\" (UID: \"fa58d12c-eed3-46e2-915f-c8383b8949fe\") " pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.111747 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa58d12c-eed3-46e2-915f-c8383b8949fe-config-data\") pod 
\"barbican-api-6f95b8bb44-tjzcn\" (UID: \"fa58d12c-eed3-46e2-915f-c8383b8949fe\") " pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.115869 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l8vg\" (UniqueName: \"kubernetes.io/projected/fa58d12c-eed3-46e2-915f-c8383b8949fe-kube-api-access-4l8vg\") pod \"barbican-api-6f95b8bb44-tjzcn\" (UID: \"fa58d12c-eed3-46e2-915f-c8383b8949fe\") " pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.123158 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa58d12c-eed3-46e2-915f-c8383b8949fe-internal-tls-certs\") pod \"barbican-api-6f95b8bb44-tjzcn\" (UID: \"fa58d12c-eed3-46e2-915f-c8383b8949fe\") " pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.188088 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.641594 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-c9j72" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.813509 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42fhc\" (UniqueName: \"kubernetes.io/projected/3e5e221e-73c7-44a2-9af9-0feb60b412e0-kube-api-access-42fhc\") pod \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\" (UID: \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\") " Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.813851 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e5e221e-73c7-44a2-9af9-0feb60b412e0-scripts\") pod \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\" (UID: \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\") " Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.813945 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e5e221e-73c7-44a2-9af9-0feb60b412e0-db-sync-config-data\") pod \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\" (UID: \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\") " Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.813994 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e5e221e-73c7-44a2-9af9-0feb60b412e0-etc-machine-id\") pod \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\" (UID: \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\") " Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.814124 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e5e221e-73c7-44a2-9af9-0feb60b412e0-config-data\") pod \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\" (UID: \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\") " Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.814273 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/3e5e221e-73c7-44a2-9af9-0feb60b412e0-combined-ca-bundle\") pod \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\" (UID: \"3e5e221e-73c7-44a2-9af9-0feb60b412e0\") " Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.814345 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e5e221e-73c7-44a2-9af9-0feb60b412e0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3e5e221e-73c7-44a2-9af9-0feb60b412e0" (UID: "3e5e221e-73c7-44a2-9af9-0feb60b412e0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.814774 4687 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e5e221e-73c7-44a2-9af9-0feb60b412e0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.818547 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e5e221e-73c7-44a2-9af9-0feb60b412e0-kube-api-access-42fhc" (OuterVolumeSpecName: "kube-api-access-42fhc") pod "3e5e221e-73c7-44a2-9af9-0feb60b412e0" (UID: "3e5e221e-73c7-44a2-9af9-0feb60b412e0"). InnerVolumeSpecName "kube-api-access-42fhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.819073 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e5e221e-73c7-44a2-9af9-0feb60b412e0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3e5e221e-73c7-44a2-9af9-0feb60b412e0" (UID: "3e5e221e-73c7-44a2-9af9-0feb60b412e0"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.822468 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e5e221e-73c7-44a2-9af9-0feb60b412e0-scripts" (OuterVolumeSpecName: "scripts") pod "3e5e221e-73c7-44a2-9af9-0feb60b412e0" (UID: "3e5e221e-73c7-44a2-9af9-0feb60b412e0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.841847 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e5e221e-73c7-44a2-9af9-0feb60b412e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e5e221e-73c7-44a2-9af9-0feb60b412e0" (UID: "3e5e221e-73c7-44a2-9af9-0feb60b412e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.854838 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e5e221e-73c7-44a2-9af9-0feb60b412e0-config-data" (OuterVolumeSpecName: "config-data") pod "3e5e221e-73c7-44a2-9af9-0feb60b412e0" (UID: "3e5e221e-73c7-44a2-9af9-0feb60b412e0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.917387 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e5e221e-73c7-44a2-9af9-0feb60b412e0-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.917417 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e5e221e-73c7-44a2-9af9-0feb60b412e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.917431 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42fhc\" (UniqueName: \"kubernetes.io/projected/3e5e221e-73c7-44a2-9af9-0feb60b412e0-kube-api-access-42fhc\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.917441 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e5e221e-73c7-44a2-9af9-0feb60b412e0-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:38 crc kubenswrapper[4687]: I0228 09:20:38.917450 4687 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e5e221e-73c7-44a2-9af9-0feb60b412e0-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:39 crc kubenswrapper[4687]: I0228 09:20:39.516899 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d6696bd5b-vf747" event={"ID":"0aa8b593-6c7b-438e-b95c-3f39081df0ea","Type":"ContainerStarted","Data":"034336a30cf1c051f4d19338adc8ce8a517bf0e5fc7b8efe9e95ba983ce1390c"} Feb 28 09:20:39 crc kubenswrapper[4687]: I0228 09:20:39.518143 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-c9j72" 
event={"ID":"3e5e221e-73c7-44a2-9af9-0feb60b412e0","Type":"ContainerDied","Data":"2208c8fed1e88f5b5b0ba488bcffee0b225598f3f7537481f3ae92ba150a8d1d"} Feb 28 09:20:39 crc kubenswrapper[4687]: I0228 09:20:39.518165 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2208c8fed1e88f5b5b0ba488bcffee0b225598f3f7537481f3ae92ba150a8d1d" Feb 28 09:20:39 crc kubenswrapper[4687]: I0228 09:20:39.518243 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-c9j72" Feb 28 09:20:39 crc kubenswrapper[4687]: I0228 09:20:39.829748 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6586f4f898-ssm26"] Feb 28 09:20:39 crc kubenswrapper[4687]: E0228 09:20:39.879235 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="2a0893a8-0386-4d6d-9476-c061c3fb5f3d" Feb 28 09:20:39 crc kubenswrapper[4687]: I0228 09:20:39.977715 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 28 09:20:39 crc kubenswrapper[4687]: E0228 09:20:39.978458 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e5e221e-73c7-44a2-9af9-0feb60b412e0" containerName="cinder-db-sync" Feb 28 09:20:39 crc kubenswrapper[4687]: I0228 09:20:39.978583 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e5e221e-73c7-44a2-9af9-0feb60b412e0" containerName="cinder-db-sync" Feb 28 09:20:39 crc kubenswrapper[4687]: I0228 09:20:39.979062 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e5e221e-73c7-44a2-9af9-0feb60b412e0" containerName="cinder-db-sync" Feb 28 09:20:39 crc kubenswrapper[4687]: I0228 09:20:39.980433 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 28 09:20:39 crc kubenswrapper[4687]: I0228 09:20:39.990327 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 28 09:20:39 crc kubenswrapper[4687]: I0228 09:20:39.990601 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 28 09:20:39 crc kubenswrapper[4687]: I0228 09:20:39.990840 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-t48hh" Feb 28 09:20:39 crc kubenswrapper[4687]: I0228 09:20:39.991472 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.004872 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.037698 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-tzjqk"] Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.069140 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-5h7z4"] Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.070682 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.085984 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-tzjqk"] Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.099589 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-5h7z4"] Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.114324 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5f58cc8c7c-dxx99"] Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.134264 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d8d4bb8d-87zwm"] Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.155415 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b82a8aed-cc7b-4802-80f0-63e701ee0593-config-data\") pod \"cinder-scheduler-0\" (UID: \"b82a8aed-cc7b-4802-80f0-63e701ee0593\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.155456 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b82a8aed-cc7b-4802-80f0-63e701ee0593-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b82a8aed-cc7b-4802-80f0-63e701ee0593\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.155500 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6k8f\" (UniqueName: \"kubernetes.io/projected/b82a8aed-cc7b-4802-80f0-63e701ee0593-kube-api-access-h6k8f\") pod \"cinder-scheduler-0\" (UID: \"b82a8aed-cc7b-4802-80f0-63e701ee0593\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.155535 4687 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b82a8aed-cc7b-4802-80f0-63e701ee0593-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b82a8aed-cc7b-4802-80f0-63e701ee0593\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.155566 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b82a8aed-cc7b-4802-80f0-63e701ee0593-scripts\") pod \"cinder-scheduler-0\" (UID: \"b82a8aed-cc7b-4802-80f0-63e701ee0593\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.155604 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b82a8aed-cc7b-4802-80f0-63e701ee0593-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b82a8aed-cc7b-4802-80f0-63e701ee0593\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.180505 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.182548 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.184657 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.189296 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.196443 4687 scope.go:117] "RemoveContainer" containerID="278b6ec447930ef86d38816dae5a24aeeb143cda885ae03ff21e8c66d663fc24" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.255959 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f95b8bb44-tjzcn"] Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.257600 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b82a8aed-cc7b-4802-80f0-63e701ee0593-config-data\") pod \"cinder-scheduler-0\" (UID: \"b82a8aed-cc7b-4802-80f0-63e701ee0593\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.257627 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b82a8aed-cc7b-4802-80f0-63e701ee0593-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b82a8aed-cc7b-4802-80f0-63e701ee0593\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.257664 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6k8f\" (UniqueName: \"kubernetes.io/projected/b82a8aed-cc7b-4802-80f0-63e701ee0593-kube-api-access-h6k8f\") pod \"cinder-scheduler-0\" (UID: \"b82a8aed-cc7b-4802-80f0-63e701ee0593\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.257695 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-dns-svc\") pod \"dnsmasq-dns-7b8fcc65cc-5h7z4\" (UID: \"1af91582-eba0-43db-8e20-00caea60a31a\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.257714 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-ovsdbserver-nb\") pod \"dnsmasq-dns-7b8fcc65cc-5h7z4\" (UID: \"1af91582-eba0-43db-8e20-00caea60a31a\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.257736 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b82a8aed-cc7b-4802-80f0-63e701ee0593-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b82a8aed-cc7b-4802-80f0-63e701ee0593\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.257783 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-config\") pod \"dnsmasq-dns-7b8fcc65cc-5h7z4\" (UID: \"1af91582-eba0-43db-8e20-00caea60a31a\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.257814 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b82a8aed-cc7b-4802-80f0-63e701ee0593-scripts\") pod \"cinder-scheduler-0\" (UID: \"b82a8aed-cc7b-4802-80f0-63e701ee0593\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.257849 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh7xs\" (UniqueName: 
\"kubernetes.io/projected/1af91582-eba0-43db-8e20-00caea60a31a-kube-api-access-zh7xs\") pod \"dnsmasq-dns-7b8fcc65cc-5h7z4\" (UID: \"1af91582-eba0-43db-8e20-00caea60a31a\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.257880 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-dns-swift-storage-0\") pod \"dnsmasq-dns-7b8fcc65cc-5h7z4\" (UID: \"1af91582-eba0-43db-8e20-00caea60a31a\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.257897 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b82a8aed-cc7b-4802-80f0-63e701ee0593-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b82a8aed-cc7b-4802-80f0-63e701ee0593\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.257939 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-ovsdbserver-sb\") pod \"dnsmasq-dns-7b8fcc65cc-5h7z4\" (UID: \"1af91582-eba0-43db-8e20-00caea60a31a\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.258092 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b82a8aed-cc7b-4802-80f0-63e701ee0593-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b82a8aed-cc7b-4802-80f0-63e701ee0593\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.265794 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b82a8aed-cc7b-4802-80f0-63e701ee0593-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b82a8aed-cc7b-4802-80f0-63e701ee0593\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.269610 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b82a8aed-cc7b-4802-80f0-63e701ee0593-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b82a8aed-cc7b-4802-80f0-63e701ee0593\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.274432 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b82a8aed-cc7b-4802-80f0-63e701ee0593-scripts\") pod \"cinder-scheduler-0\" (UID: \"b82a8aed-cc7b-4802-80f0-63e701ee0593\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.278475 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6k8f\" (UniqueName: \"kubernetes.io/projected/b82a8aed-cc7b-4802-80f0-63e701ee0593-kube-api-access-h6k8f\") pod \"cinder-scheduler-0\" (UID: \"b82a8aed-cc7b-4802-80f0-63e701ee0593\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.285422 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b82a8aed-cc7b-4802-80f0-63e701ee0593-config-data\") pod \"cinder-scheduler-0\" (UID: \"b82a8aed-cc7b-4802-80f0-63e701ee0593\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.343048 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.360350 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcfj4\" (UniqueName: \"kubernetes.io/projected/e2c76976-8cdb-45e0-826d-5d465de1829c-kube-api-access-lcfj4\") pod \"cinder-api-0\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " pod="openstack/cinder-api-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.360423 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2c76976-8cdb-45e0-826d-5d465de1829c-logs\") pod \"cinder-api-0\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " pod="openstack/cinder-api-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.360463 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2c76976-8cdb-45e0-826d-5d465de1829c-config-data\") pod \"cinder-api-0\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " pod="openstack/cinder-api-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.360500 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2c76976-8cdb-45e0-826d-5d465de1829c-scripts\") pod \"cinder-api-0\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " pod="openstack/cinder-api-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.360535 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-dns-svc\") pod \"dnsmasq-dns-7b8fcc65cc-5h7z4\" (UID: \"1af91582-eba0-43db-8e20-00caea60a31a\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.360549 4687 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-ovsdbserver-nb\") pod \"dnsmasq-dns-7b8fcc65cc-5h7z4\" (UID: \"1af91582-eba0-43db-8e20-00caea60a31a\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.360570 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-config\") pod \"dnsmasq-dns-7b8fcc65cc-5h7z4\" (UID: \"1af91582-eba0-43db-8e20-00caea60a31a\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.360609 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c76976-8cdb-45e0-826d-5d465de1829c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " pod="openstack/cinder-api-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.360639 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh7xs\" (UniqueName: \"kubernetes.io/projected/1af91582-eba0-43db-8e20-00caea60a31a-kube-api-access-zh7xs\") pod \"dnsmasq-dns-7b8fcc65cc-5h7z4\" (UID: \"1af91582-eba0-43db-8e20-00caea60a31a\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.360661 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-dns-swift-storage-0\") pod \"dnsmasq-dns-7b8fcc65cc-5h7z4\" (UID: \"1af91582-eba0-43db-8e20-00caea60a31a\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.361051 4687 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2c76976-8cdb-45e0-826d-5d465de1829c-config-data-custom\") pod \"cinder-api-0\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " pod="openstack/cinder-api-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.361087 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2c76976-8cdb-45e0-826d-5d465de1829c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " pod="openstack/cinder-api-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.361127 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-ovsdbserver-sb\") pod \"dnsmasq-dns-7b8fcc65cc-5h7z4\" (UID: \"1af91582-eba0-43db-8e20-00caea60a31a\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.361866 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-ovsdbserver-sb\") pod \"dnsmasq-dns-7b8fcc65cc-5h7z4\" (UID: \"1af91582-eba0-43db-8e20-00caea60a31a\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.362574 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-dns-swift-storage-0\") pod \"dnsmasq-dns-7b8fcc65cc-5h7z4\" (UID: \"1af91582-eba0-43db-8e20-00caea60a31a\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.362683 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-config\") pod \"dnsmasq-dns-7b8fcc65cc-5h7z4\" (UID: \"1af91582-eba0-43db-8e20-00caea60a31a\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.363250 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-ovsdbserver-nb\") pod \"dnsmasq-dns-7b8fcc65cc-5h7z4\" (UID: \"1af91582-eba0-43db-8e20-00caea60a31a\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.363299 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-dns-svc\") pod \"dnsmasq-dns-7b8fcc65cc-5h7z4\" (UID: \"1af91582-eba0-43db-8e20-00caea60a31a\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.382100 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh7xs\" (UniqueName: \"kubernetes.io/projected/1af91582-eba0-43db-8e20-00caea60a31a-kube-api-access-zh7xs\") pod \"dnsmasq-dns-7b8fcc65cc-5h7z4\" (UID: \"1af91582-eba0-43db-8e20-00caea60a31a\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.398249 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.464767 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c76976-8cdb-45e0-826d-5d465de1829c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " pod="openstack/cinder-api-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.464872 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2c76976-8cdb-45e0-826d-5d465de1829c-config-data-custom\") pod \"cinder-api-0\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " pod="openstack/cinder-api-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.464907 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2c76976-8cdb-45e0-826d-5d465de1829c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " pod="openstack/cinder-api-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.464992 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcfj4\" (UniqueName: \"kubernetes.io/projected/e2c76976-8cdb-45e0-826d-5d465de1829c-kube-api-access-lcfj4\") pod \"cinder-api-0\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " pod="openstack/cinder-api-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.465087 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2c76976-8cdb-45e0-826d-5d465de1829c-logs\") pod \"cinder-api-0\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " pod="openstack/cinder-api-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.465110 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2c76976-8cdb-45e0-826d-5d465de1829c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " pod="openstack/cinder-api-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.465143 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2c76976-8cdb-45e0-826d-5d465de1829c-config-data\") pod \"cinder-api-0\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " pod="openstack/cinder-api-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.465537 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2c76976-8cdb-45e0-826d-5d465de1829c-logs\") pod \"cinder-api-0\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " pod="openstack/cinder-api-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.465618 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2c76976-8cdb-45e0-826d-5d465de1829c-scripts\") pod \"cinder-api-0\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " pod="openstack/cinder-api-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.468449 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c76976-8cdb-45e0-826d-5d465de1829c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " pod="openstack/cinder-api-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.468639 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2c76976-8cdb-45e0-826d-5d465de1829c-scripts\") pod \"cinder-api-0\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " pod="openstack/cinder-api-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.469764 4687 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2c76976-8cdb-45e0-826d-5d465de1829c-config-data\") pod \"cinder-api-0\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " pod="openstack/cinder-api-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.470188 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2c76976-8cdb-45e0-826d-5d465de1829c-config-data-custom\") pod \"cinder-api-0\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " pod="openstack/cinder-api-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.480239 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcfj4\" (UniqueName: \"kubernetes.io/projected/e2c76976-8cdb-45e0-826d-5d465de1829c-kube-api-access-lcfj4\") pod \"cinder-api-0\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " pod="openstack/cinder-api-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.543355 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7b9c5b669b-xd8lz" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.591563 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d6696bd5b-vf747" event={"ID":"0aa8b593-6c7b-438e-b95c-3f39081df0ea","Type":"ContainerStarted","Data":"5b37965806bae7991073f98dbc5660ffa50a1aae8ca7c1004a1130fbd7850044"} Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.594134 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.594232 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d6696bd5b-vf747" event={"ID":"0aa8b593-6c7b-438e-b95c-3f39081df0ea","Type":"ContainerStarted","Data":"ac9ca3faafd64ad4de665a7da6ed009138ac3c04b0d609cf0c7d1ff3753510bb"} Feb 28 09:20:40 crc 
kubenswrapper[4687]: I0228 09:20:40.594967 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.609622 4687 generic.go:334] "Generic (PLEG): container finished" podID="bad05ef2-b8b3-4844-a104-7bf24d1398b0" containerID="f3715a88a6d5f3c06cbbbf5922e4eccb7a3ef9dc5886bf0692e44025483cf67d" exitCode=0 Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.609876 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-tzjqk" event={"ID":"bad05ef2-b8b3-4844-a104-7bf24d1398b0","Type":"ContainerDied","Data":"f3715a88a6d5f3c06cbbbf5922e4eccb7a3ef9dc5886bf0692e44025483cf67d"} Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.609915 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-tzjqk" event={"ID":"bad05ef2-b8b3-4844-a104-7bf24d1398b0","Type":"ContainerStarted","Data":"e275141dc783b7e016fe1f9a7b672780f44f4b5eda17227d320d2d4c645d15aa"} Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.618830 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.627810 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d8d4bb8d-87zwm" event={"ID":"aa88f1b2-477c-461c-a044-88fd35c31231","Type":"ContainerStarted","Data":"20c3398c07b4a58ec0440d2377669a98032c546a99db3287a0bf00e017a3e7b6"} Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.633124 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6586f4f898-ssm26" event={"ID":"cc722f81-31b0-44eb-8206-4256e2ae12f0","Type":"ContainerStarted","Data":"d2e03bc994a1cdbeed2b5a970081da8ea6fc1327bfc9d510dd74e59d23c91883"} Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.641608 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a0893a8-0386-4d6d-9476-c061c3fb5f3d","Type":"ContainerStarted","Data":"a1fed5d6d5d0aa69f698abb8377dbac8adffa03875f6933981857f1250afb4e3"} Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.641752 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2a0893a8-0386-4d6d-9476-c061c3fb5f3d" containerName="ceilometer-notification-agent" containerID="cri-o://404b8da225a564a9322c0d472094c80332802f0e803b8ac973b8bb4bfb07d4de" gracePeriod=30 Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.642514 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.642571 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2a0893a8-0386-4d6d-9476-c061c3fb5f3d" containerName="proxy-httpd" containerID="cri-o://a1fed5d6d5d0aa69f698abb8377dbac8adffa03875f6933981857f1250afb4e3" gracePeriod=30 Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.642619 4687 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="2a0893a8-0386-4d6d-9476-c061c3fb5f3d" containerName="sg-core" containerID="cri-o://ec2211cc8159f7654685062ebd6bbc5d493f2f317474a1dfca1a6c26b052d1b7" gracePeriod=30 Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.644349 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-d6696bd5b-vf747" podStartSLOduration=7.6443314959999995 podStartE2EDuration="7.644331496s" podCreationTimestamp="2026-02-28 09:20:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:20:40.631075712 +0000 UTC m=+1032.321645059" watchObservedRunningTime="2026-02-28 09:20:40.644331496 +0000 UTC m=+1032.334900832" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.648932 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f95b8bb44-tjzcn" event={"ID":"fa58d12c-eed3-46e2-915f-c8383b8949fe","Type":"ContainerStarted","Data":"71d64962c93be2715c2bb8826aa77e2c0bc9fa920e926218645fdacb9f7e0554"} Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.650281 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5f58cc8c7c-dxx99" event={"ID":"5ec85d56-f00e-4193-b4eb-ae0d43a13ffa","Type":"ContainerStarted","Data":"5d37c16422d58ea742d15a5482e77f9e3ed339cb7af852483758939c4559ead0"} Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.888483 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.926699 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5bd77ccf75-bqx56"] Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.927066 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5bd77ccf75-bqx56" podUID="d655bdf4-33ab-45fa-b1e4-c37aede5609a" containerName="neutron-api" 
containerID="cri-o://51b219e86f3b0d6b4919b070002226d15fce4b8fe16494e79bab096be1e39e20" gracePeriod=30 Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.927137 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5bd77ccf75-bqx56" podUID="d655bdf4-33ab-45fa-b1e4-c37aede5609a" containerName="neutron-httpd" containerID="cri-o://1f47f176744fd7232de0f9faea595a9e3333827c6923ad75f5f60d0995f4502e" gracePeriod=30 Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.968474 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.992504 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6bd86ccc79-8jlb2"] Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.994089 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bd86ccc79-8jlb2" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.997618 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a-httpd-config\") pod \"neutron-6bd86ccc79-8jlb2\" (UID: \"2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a\") " pod="openstack/neutron-6bd86ccc79-8jlb2" Feb 28 09:20:40 crc kubenswrapper[4687]: I0228 09:20:40.997672 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a-config\") pod \"neutron-6bd86ccc79-8jlb2\" (UID: \"2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a\") " pod="openstack/neutron-6bd86ccc79-8jlb2" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.007130 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntj7w\" (UniqueName: 
\"kubernetes.io/projected/2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a-kube-api-access-ntj7w\") pod \"neutron-6bd86ccc79-8jlb2\" (UID: \"2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a\") " pod="openstack/neutron-6bd86ccc79-8jlb2" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.007227 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a-internal-tls-certs\") pod \"neutron-6bd86ccc79-8jlb2\" (UID: \"2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a\") " pod="openstack/neutron-6bd86ccc79-8jlb2" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.007493 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a-ovndb-tls-certs\") pod \"neutron-6bd86ccc79-8jlb2\" (UID: \"2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a\") " pod="openstack/neutron-6bd86ccc79-8jlb2" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.007529 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a-combined-ca-bundle\") pod \"neutron-6bd86ccc79-8jlb2\" (UID: \"2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a\") " pod="openstack/neutron-6bd86ccc79-8jlb2" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.007601 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a-public-tls-certs\") pod \"neutron-6bd86ccc79-8jlb2\" (UID: \"2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a\") " pod="openstack/neutron-6bd86ccc79-8jlb2" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.036172 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bd86ccc79-8jlb2"] Feb 28 09:20:41 
crc kubenswrapper[4687]: I0228 09:20:41.070936 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-5h7z4"] Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.102668 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8449d68f4f-tzjqk" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.110334 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a-ovndb-tls-certs\") pod \"neutron-6bd86ccc79-8jlb2\" (UID: \"2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a\") " pod="openstack/neutron-6bd86ccc79-8jlb2" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.110378 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a-combined-ca-bundle\") pod \"neutron-6bd86ccc79-8jlb2\" (UID: \"2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a\") " pod="openstack/neutron-6bd86ccc79-8jlb2" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.110440 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a-public-tls-certs\") pod \"neutron-6bd86ccc79-8jlb2\" (UID: \"2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a\") " pod="openstack/neutron-6bd86ccc79-8jlb2" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.110488 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a-httpd-config\") pod \"neutron-6bd86ccc79-8jlb2\" (UID: \"2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a\") " pod="openstack/neutron-6bd86ccc79-8jlb2" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.110516 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/secret/2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a-config\") pod \"neutron-6bd86ccc79-8jlb2\" (UID: \"2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a\") " pod="openstack/neutron-6bd86ccc79-8jlb2" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.110686 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntj7w\" (UniqueName: \"kubernetes.io/projected/2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a-kube-api-access-ntj7w\") pod \"neutron-6bd86ccc79-8jlb2\" (UID: \"2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a\") " pod="openstack/neutron-6bd86ccc79-8jlb2" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.110722 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a-internal-tls-certs\") pod \"neutron-6bd86ccc79-8jlb2\" (UID: \"2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a\") " pod="openstack/neutron-6bd86ccc79-8jlb2" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.115382 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a-internal-tls-certs\") pod \"neutron-6bd86ccc79-8jlb2\" (UID: \"2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a\") " pod="openstack/neutron-6bd86ccc79-8jlb2" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.121686 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a-config\") pod \"neutron-6bd86ccc79-8jlb2\" (UID: \"2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a\") " pod="openstack/neutron-6bd86ccc79-8jlb2" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.122600 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a-public-tls-certs\") pod 
\"neutron-6bd86ccc79-8jlb2\" (UID: \"2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a\") " pod="openstack/neutron-6bd86ccc79-8jlb2" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.123038 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a-httpd-config\") pod \"neutron-6bd86ccc79-8jlb2\" (UID: \"2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a\") " pod="openstack/neutron-6bd86ccc79-8jlb2" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.136668 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntj7w\" (UniqueName: \"kubernetes.io/projected/2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a-kube-api-access-ntj7w\") pod \"neutron-6bd86ccc79-8jlb2\" (UID: \"2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a\") " pod="openstack/neutron-6bd86ccc79-8jlb2" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.140964 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a-ovndb-tls-certs\") pod \"neutron-6bd86ccc79-8jlb2\" (UID: \"2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a\") " pod="openstack/neutron-6bd86ccc79-8jlb2" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.141693 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a-combined-ca-bundle\") pod \"neutron-6bd86ccc79-8jlb2\" (UID: \"2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a\") " pod="openstack/neutron-6bd86ccc79-8jlb2" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.211841 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-dns-svc\") pod \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\" (UID: \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\") " Feb 28 09:20:41 crc 
kubenswrapper[4687]: I0228 09:20:41.212109 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz676\" (UniqueName: \"kubernetes.io/projected/bad05ef2-b8b3-4844-a104-7bf24d1398b0-kube-api-access-xz676\") pod \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\" (UID: \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\") " Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.212147 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-config\") pod \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\" (UID: \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\") " Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.219469 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-dns-swift-storage-0\") pod \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\" (UID: \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\") " Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.219586 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-ovsdbserver-nb\") pod \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\" (UID: \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\") " Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.219711 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-ovsdbserver-sb\") pod \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\" (UID: \"bad05ef2-b8b3-4844-a104-7bf24d1398b0\") " Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.231041 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.259542 4687 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bad05ef2-b8b3-4844-a104-7bf24d1398b0-kube-api-access-xz676" (OuterVolumeSpecName: "kube-api-access-xz676") pod "bad05ef2-b8b3-4844-a104-7bf24d1398b0" (UID: "bad05ef2-b8b3-4844-a104-7bf24d1398b0"). InnerVolumeSpecName "kube-api-access-xz676". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.265933 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bad05ef2-b8b3-4844-a104-7bf24d1398b0" (UID: "bad05ef2-b8b3-4844-a104-7bf24d1398b0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.270242 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-config" (OuterVolumeSpecName: "config") pod "bad05ef2-b8b3-4844-a104-7bf24d1398b0" (UID: "bad05ef2-b8b3-4844-a104-7bf24d1398b0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.274507 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bad05ef2-b8b3-4844-a104-7bf24d1398b0" (UID: "bad05ef2-b8b3-4844-a104-7bf24d1398b0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.322945 4687 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.322972 4687 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.322983 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz676\" (UniqueName: \"kubernetes.io/projected/bad05ef2-b8b3-4844-a104-7bf24d1398b0-kube-api-access-xz676\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.322992 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.342643 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bd86ccc79-8jlb2" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.371554 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bad05ef2-b8b3-4844-a104-7bf24d1398b0" (UID: "bad05ef2-b8b3-4844-a104-7bf24d1398b0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.372560 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bad05ef2-b8b3-4844-a104-7bf24d1398b0" (UID: "bad05ef2-b8b3-4844-a104-7bf24d1398b0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.426441 4687 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.426473 4687 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bad05ef2-b8b3-4844-a104-7bf24d1398b0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.720662 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f95b8bb44-tjzcn" event={"ID":"fa58d12c-eed3-46e2-915f-c8383b8949fe","Type":"ContainerStarted","Data":"cef382ffbdc14311e61210013d3d9df25f9cf91a79504a55966b2580e6553f32"} Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.720926 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f95b8bb44-tjzcn" event={"ID":"fa58d12c-eed3-46e2-915f-c8383b8949fe","Type":"ContainerStarted","Data":"004e4644c7912526dbe25ae3f0a8849683361cb5038837677f2e9a7a4f6dea1a"} Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.723119 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.723187 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.730919 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-tzjqk" event={"ID":"bad05ef2-b8b3-4844-a104-7bf24d1398b0","Type":"ContainerDied","Data":"e275141dc783b7e016fe1f9a7b672780f44f4b5eda17227d320d2d4c645d15aa"} Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.730954 4687 scope.go:117] "RemoveContainer" containerID="f3715a88a6d5f3c06cbbbf5922e4eccb7a3ef9dc5886bf0692e44025483cf67d" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.731181 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8449d68f4f-tzjqk" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.752340 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e2c76976-8cdb-45e0-826d-5d465de1829c","Type":"ContainerStarted","Data":"2c07a016be4806dbeeebb6a66368796619fe84cad3c0cf4b0095514545cab7d3"} Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.756109 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6f95b8bb44-tjzcn" podStartSLOduration=4.756098922 podStartE2EDuration="4.756098922s" podCreationTimestamp="2026-02-28 09:20:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:20:41.739990695 +0000 UTC m=+1033.430560032" watchObservedRunningTime="2026-02-28 09:20:41.756098922 +0000 UTC m=+1033.446668260" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.795052 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d8d4bb8d-87zwm" event={"ID":"aa88f1b2-477c-461c-a044-88fd35c31231","Type":"ContainerStarted","Data":"b1e3eae08f7f2f92c12748feeba374abf768df6a9244158975b18d0b40305051"} Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.795122 4687 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d8d4bb8d-87zwm" event={"ID":"aa88f1b2-477c-461c-a044-88fd35c31231","Type":"ContainerStarted","Data":"a9fe336de21c280d3de01f19fc9fddfa3d280f1561ff146cf049d7465c378b2d"} Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.796572 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-tzjqk"] Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.796622 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5d8d4bb8d-87zwm" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.796640 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5d8d4bb8d-87zwm" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.804011 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-tzjqk"] Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.826777 4687 generic.go:334] "Generic (PLEG): container finished" podID="2a0893a8-0386-4d6d-9476-c061c3fb5f3d" containerID="a1fed5d6d5d0aa69f698abb8377dbac8adffa03875f6933981857f1250afb4e3" exitCode=0 Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.826804 4687 generic.go:334] "Generic (PLEG): container finished" podID="2a0893a8-0386-4d6d-9476-c061c3fb5f3d" containerID="ec2211cc8159f7654685062ebd6bbc5d493f2f317474a1dfca1a6c26b052d1b7" exitCode=2 Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.826868 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a0893a8-0386-4d6d-9476-c061c3fb5f3d","Type":"ContainerDied","Data":"a1fed5d6d5d0aa69f698abb8377dbac8adffa03875f6933981857f1250afb4e3"} Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.826895 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2a0893a8-0386-4d6d-9476-c061c3fb5f3d","Type":"ContainerDied","Data":"ec2211cc8159f7654685062ebd6bbc5d493f2f317474a1dfca1a6c26b052d1b7"} Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.844239 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5d8d4bb8d-87zwm" podStartSLOduration=7.84422166 podStartE2EDuration="7.84422166s" podCreationTimestamp="2026-02-28 09:20:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:20:41.824520585 +0000 UTC m=+1033.515089932" watchObservedRunningTime="2026-02-28 09:20:41.84422166 +0000 UTC m=+1033.534790997" Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.857141 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b82a8aed-cc7b-4802-80f0-63e701ee0593","Type":"ContainerStarted","Data":"2cbfd448dfbfd7ff8b1c920092f8eb02bd9ac026d2b29b479aec909800756385"} Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.859241 4687 generic.go:334] "Generic (PLEG): container finished" podID="1af91582-eba0-43db-8e20-00caea60a31a" containerID="1059573a418168df4350af102eafdba6bdc26f0c68f02d94fe7749c4a80a11ec" exitCode=0 Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.859479 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" event={"ID":"1af91582-eba0-43db-8e20-00caea60a31a","Type":"ContainerDied","Data":"1059573a418168df4350af102eafdba6bdc26f0c68f02d94fe7749c4a80a11ec"} Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.859518 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" event={"ID":"1af91582-eba0-43db-8e20-00caea60a31a","Type":"ContainerStarted","Data":"28cc9f792b72eebf39235a0c18c3bd2f077465c537ca68227c3a35ceea3b9b29"} Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.868732 4687 generic.go:334] "Generic 
(PLEG): container finished" podID="d655bdf4-33ab-45fa-b1e4-c37aede5609a" containerID="1f47f176744fd7232de0f9faea595a9e3333827c6923ad75f5f60d0995f4502e" exitCode=0 Feb 28 09:20:41 crc kubenswrapper[4687]: I0228 09:20:41.869825 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bd77ccf75-bqx56" event={"ID":"d655bdf4-33ab-45fa-b1e4-c37aede5609a","Type":"ContainerDied","Data":"1f47f176744fd7232de0f9faea595a9e3333827c6923ad75f5f60d0995f4502e"} Feb 28 09:20:42 crc kubenswrapper[4687]: I0228 09:20:42.104398 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bd86ccc79-8jlb2"] Feb 28 09:20:42 crc kubenswrapper[4687]: I0228 09:20:42.619080 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 28 09:20:42 crc kubenswrapper[4687]: I0228 09:20:42.687902 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bad05ef2-b8b3-4844-a104-7bf24d1398b0" path="/var/lib/kubelet/pods/bad05ef2-b8b3-4844-a104-7bf24d1398b0/volumes" Feb 28 09:20:42 crc kubenswrapper[4687]: I0228 09:20:42.884684 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b82a8aed-cc7b-4802-80f0-63e701ee0593","Type":"ContainerStarted","Data":"76ec839f85509f3b4f49b9c4eca232d78d0752975887739b09646d96f7698d09"} Feb 28 09:20:42 crc kubenswrapper[4687]: I0228 09:20:42.887144 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" event={"ID":"1af91582-eba0-43db-8e20-00caea60a31a","Type":"ContainerStarted","Data":"6fdce1c4712b81e3a7f2c0f2f5d350742b7afa3b137795c3f6466d725d830c99"} Feb 28 09:20:42 crc kubenswrapper[4687]: I0228 09:20:42.888441 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" Feb 28 09:20:42 crc kubenswrapper[4687]: I0228 09:20:42.890127 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bd86ccc79-8jlb2" 
event={"ID":"2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a","Type":"ContainerStarted","Data":"9a0be9d5972d20156f167efc46e59e42e703ecf158fb6f063b52a7ff8e203748"} Feb 28 09:20:42 crc kubenswrapper[4687]: I0228 09:20:42.890152 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bd86ccc79-8jlb2" event={"ID":"2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a","Type":"ContainerStarted","Data":"29f57c534273628103ae8cdae5065f7ed2376a248a1077403ca36a71b8399568"} Feb 28 09:20:42 crc kubenswrapper[4687]: I0228 09:20:42.898554 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e2c76976-8cdb-45e0-826d-5d465de1829c","Type":"ContainerStarted","Data":"1b04a115a7dbb0f4ad683d7685204049b7616de2f4a65926728bdcd67cc13ace"} Feb 28 09:20:42 crc kubenswrapper[4687]: I0228 09:20:42.904542 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" podStartSLOduration=2.904533237 podStartE2EDuration="2.904533237s" podCreationTimestamp="2026-02-28 09:20:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:20:42.901310003 +0000 UTC m=+1034.591879340" watchObservedRunningTime="2026-02-28 09:20:42.904533237 +0000 UTC m=+1034.595102573" Feb 28 09:20:42 crc kubenswrapper[4687]: I0228 09:20:42.951669 4687 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5bd77ccf75-bqx56" podUID="d655bdf4-33ab-45fa-b1e4-c37aede5609a" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.159:9696/\": dial tcp 10.217.0.159:9696: connect: connection refused" Feb 28 09:20:43 crc kubenswrapper[4687]: I0228 09:20:43.806577 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:20:43 crc kubenswrapper[4687]: I0228 09:20:43.911321 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-6bd86ccc79-8jlb2" event={"ID":"2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a","Type":"ContainerStarted","Data":"c6ec84b6a245b27ccff3e6073ff026742961bd1fd0dc97a2d42bbab5a2eec28e"} Feb 28 09:20:43 crc kubenswrapper[4687]: I0228 09:20:43.912748 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6bd86ccc79-8jlb2" Feb 28 09:20:43 crc kubenswrapper[4687]: I0228 09:20:43.914814 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5f58cc8c7c-dxx99" event={"ID":"5ec85d56-f00e-4193-b4eb-ae0d43a13ffa","Type":"ContainerStarted","Data":"9e8a40d87282367bda4bcadda718cd477169838028754ac5429fc9c4b05c66d8"} Feb 28 09:20:43 crc kubenswrapper[4687]: I0228 09:20:43.924547 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6586f4f898-ssm26" event={"ID":"cc722f81-31b0-44eb-8206-4256e2ae12f0","Type":"ContainerStarted","Data":"4f5207c10bcaa6d2b838557f4cad76f9d83e03491fa677ba31aa10c0d6a488f7"} Feb 28 09:20:43 crc kubenswrapper[4687]: I0228 09:20:43.954219 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-b9587f844-jq5pd" Feb 28 09:20:43 crc kubenswrapper[4687]: I0228 09:20:43.975392 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6bd86ccc79-8jlb2" podStartSLOduration=3.975381971 podStartE2EDuration="3.975381971s" podCreationTimestamp="2026-02-28 09:20:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:20:43.943410617 +0000 UTC m=+1035.633979944" watchObservedRunningTime="2026-02-28 09:20:43.975381971 +0000 UTC m=+1035.665951309" Feb 28 09:20:44 crc kubenswrapper[4687]: I0228 09:20:44.932221 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"e2c76976-8cdb-45e0-826d-5d465de1829c","Type":"ContainerStarted","Data":"e6a89344c10867261364545fc098816806109be63afd73b0b07cdc65a69c5a00"} Feb 28 09:20:44 crc kubenswrapper[4687]: I0228 09:20:44.932335 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e2c76976-8cdb-45e0-826d-5d465de1829c" containerName="cinder-api-log" containerID="cri-o://1b04a115a7dbb0f4ad683d7685204049b7616de2f4a65926728bdcd67cc13ace" gracePeriod=30 Feb 28 09:20:44 crc kubenswrapper[4687]: I0228 09:20:44.932422 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e2c76976-8cdb-45e0-826d-5d465de1829c" containerName="cinder-api" containerID="cri-o://e6a89344c10867261364545fc098816806109be63afd73b0b07cdc65a69c5a00" gracePeriod=30 Feb 28 09:20:44 crc kubenswrapper[4687]: I0228 09:20:44.932636 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 28 09:20:44 crc kubenswrapper[4687]: I0228 09:20:44.937975 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b82a8aed-cc7b-4802-80f0-63e701ee0593","Type":"ContainerStarted","Data":"85a9619d18290d62f51f4d9dba3f1999673011b6aeca5038cb9261a82771a841"} Feb 28 09:20:44 crc kubenswrapper[4687]: I0228 09:20:44.940322 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6586f4f898-ssm26" event={"ID":"cc722f81-31b0-44eb-8206-4256e2ae12f0","Type":"ContainerStarted","Data":"eb008422354d5301676e562979a127427e3840a6da6ec7f60dea1e2c10c24777"} Feb 28 09:20:44 crc kubenswrapper[4687]: I0228 09:20:44.942762 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5f58cc8c7c-dxx99" event={"ID":"5ec85d56-f00e-4193-b4eb-ae0d43a13ffa","Type":"ContainerStarted","Data":"f2217e79a118fe2e964760118c80553fd16efa41641b0054e2735ebd9b88ce8e"} Feb 28 09:20:44 crc kubenswrapper[4687]: 
I0228 09:20:44.951284 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.9512734940000005 podStartE2EDuration="4.951273494s" podCreationTimestamp="2026-02-28 09:20:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:20:44.947903696 +0000 UTC m=+1036.638473033" watchObservedRunningTime="2026-02-28 09:20:44.951273494 +0000 UTC m=+1036.641842831" Feb 28 09:20:44 crc kubenswrapper[4687]: I0228 09:20:44.973329 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6586f4f898-ssm26" podStartSLOduration=7.728424571 podStartE2EDuration="10.973311545s" podCreationTimestamp="2026-02-28 09:20:34 +0000 UTC" firstStartedPulling="2026-02-28 09:20:39.835072426 +0000 UTC m=+1031.525641763" lastFinishedPulling="2026-02-28 09:20:43.079959399 +0000 UTC m=+1034.770528737" observedRunningTime="2026-02-28 09:20:44.967637583 +0000 UTC m=+1036.658206919" watchObservedRunningTime="2026-02-28 09:20:44.973311545 +0000 UTC m=+1036.663880882" Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.007533 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5f58cc8c7c-dxx99" podStartSLOduration=8.052453897 podStartE2EDuration="11.007506711s" podCreationTimestamp="2026-02-28 09:20:34 +0000 UTC" firstStartedPulling="2026-02-28 09:20:40.124330472 +0000 UTC m=+1031.814899809" lastFinishedPulling="2026-02-28 09:20:43.079383286 +0000 UTC m=+1034.769952623" observedRunningTime="2026-02-28 09:20:44.993361305 +0000 UTC m=+1036.683930642" watchObservedRunningTime="2026-02-28 09:20:45.007506711 +0000 UTC m=+1036.698076048" Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.032813 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.903065063 
podStartE2EDuration="6.032790146s" podCreationTimestamp="2026-02-28 09:20:39 +0000 UTC" firstStartedPulling="2026-02-28 09:20:40.928254407 +0000 UTC m=+1032.618823744" lastFinishedPulling="2026-02-28 09:20:42.05797949 +0000 UTC m=+1033.748548827" observedRunningTime="2026-02-28 09:20:45.016657703 +0000 UTC m=+1036.707227050" watchObservedRunningTime="2026-02-28 09:20:45.032790146 +0000 UTC m=+1036.723359483" Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.338159 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.609296 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-b9587f844-jq5pd" Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.672826 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5d58956cb6-f8plp"] Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.673089 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5d58956cb6-f8plp" podUID="6a06887c-91c5-43bb-8631-53fac29e79b6" containerName="horizon-log" containerID="cri-o://72f5b1d21b2565af1ff09d9cba487ca40b4971d91a32230255a8e098ffc62761" gracePeriod=30 Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.674209 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5d58956cb6-f8plp" podUID="6a06887c-91c5-43bb-8631-53fac29e79b6" containerName="horizon" containerID="cri-o://57eba8c8848cfdc58b9d231bc4a845a3aef1d76384a7fc2e2fb3b3a4dcffe324" gracePeriod=30 Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.686555 4687 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5d58956cb6-f8plp" podUID="6a06887c-91c5-43bb-8631-53fac29e79b6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Feb 28 09:20:45 crc 
kubenswrapper[4687]: I0228 09:20:45.836784 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.946159 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2c76976-8cdb-45e0-826d-5d465de1829c-config-data\") pod \"e2c76976-8cdb-45e0-826d-5d465de1829c\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.946266 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c76976-8cdb-45e0-826d-5d465de1829c-combined-ca-bundle\") pod \"e2c76976-8cdb-45e0-826d-5d465de1829c\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.946378 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2c76976-8cdb-45e0-826d-5d465de1829c-scripts\") pod \"e2c76976-8cdb-45e0-826d-5d465de1829c\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.946608 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2c76976-8cdb-45e0-826d-5d465de1829c-config-data-custom\") pod \"e2c76976-8cdb-45e0-826d-5d465de1829c\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.946640 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2c76976-8cdb-45e0-826d-5d465de1829c-logs\") pod \"e2c76976-8cdb-45e0-826d-5d465de1829c\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.946667 4687 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2c76976-8cdb-45e0-826d-5d465de1829c-etc-machine-id\") pod \"e2c76976-8cdb-45e0-826d-5d465de1829c\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.946745 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcfj4\" (UniqueName: \"kubernetes.io/projected/e2c76976-8cdb-45e0-826d-5d465de1829c-kube-api-access-lcfj4\") pod \"e2c76976-8cdb-45e0-826d-5d465de1829c\" (UID: \"e2c76976-8cdb-45e0-826d-5d465de1829c\") " Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.947348 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2c76976-8cdb-45e0-826d-5d465de1829c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e2c76976-8cdb-45e0-826d-5d465de1829c" (UID: "e2c76976-8cdb-45e0-826d-5d465de1829c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.947357 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2c76976-8cdb-45e0-826d-5d465de1829c-logs" (OuterVolumeSpecName: "logs") pod "e2c76976-8cdb-45e0-826d-5d465de1829c" (UID: "e2c76976-8cdb-45e0-826d-5d465de1829c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.954256 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c76976-8cdb-45e0-826d-5d465de1829c-scripts" (OuterVolumeSpecName: "scripts") pod "e2c76976-8cdb-45e0-826d-5d465de1829c" (UID: "e2c76976-8cdb-45e0-826d-5d465de1829c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.957858 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c76976-8cdb-45e0-826d-5d465de1829c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e2c76976-8cdb-45e0-826d-5d465de1829c" (UID: "e2c76976-8cdb-45e0-826d-5d465de1829c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.959826 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2c76976-8cdb-45e0-826d-5d465de1829c-kube-api-access-lcfj4" (OuterVolumeSpecName: "kube-api-access-lcfj4") pod "e2c76976-8cdb-45e0-826d-5d465de1829c" (UID: "e2c76976-8cdb-45e0-826d-5d465de1829c"). InnerVolumeSpecName "kube-api-access-lcfj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.963214 4687 generic.go:334] "Generic (PLEG): container finished" podID="e2c76976-8cdb-45e0-826d-5d465de1829c" containerID="e6a89344c10867261364545fc098816806109be63afd73b0b07cdc65a69c5a00" exitCode=0 Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.963250 4687 generic.go:334] "Generic (PLEG): container finished" podID="e2c76976-8cdb-45e0-826d-5d465de1829c" containerID="1b04a115a7dbb0f4ad683d7685204049b7616de2f4a65926728bdcd67cc13ace" exitCode=143 Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.963300 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e2c76976-8cdb-45e0-826d-5d465de1829c","Type":"ContainerDied","Data":"e6a89344c10867261364545fc098816806109be63afd73b0b07cdc65a69c5a00"} Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.963330 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"e2c76976-8cdb-45e0-826d-5d465de1829c","Type":"ContainerDied","Data":"1b04a115a7dbb0f4ad683d7685204049b7616de2f4a65926728bdcd67cc13ace"} Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.963347 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.963340 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e2c76976-8cdb-45e0-826d-5d465de1829c","Type":"ContainerDied","Data":"2c07a016be4806dbeeebb6a66368796619fe84cad3c0cf4b0095514545cab7d3"} Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.963386 4687 scope.go:117] "RemoveContainer" containerID="e6a89344c10867261364545fc098816806109be63afd73b0b07cdc65a69c5a00" Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.986872 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c76976-8cdb-45e0-826d-5d465de1829c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2c76976-8cdb-45e0-826d-5d465de1829c" (UID: "e2c76976-8cdb-45e0-826d-5d465de1829c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.991324 4687 generic.go:334] "Generic (PLEG): container finished" podID="2a0893a8-0386-4d6d-9476-c061c3fb5f3d" containerID="404b8da225a564a9322c0d472094c80332802f0e803b8ac973b8bb4bfb07d4de" exitCode=0 Feb 28 09:20:45 crc kubenswrapper[4687]: I0228 09:20:45.993294 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a0893a8-0386-4d6d-9476-c061c3fb5f3d","Type":"ContainerDied","Data":"404b8da225a564a9322c0d472094c80332802f0e803b8ac973b8bb4bfb07d4de"} Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.016221 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c76976-8cdb-45e0-826d-5d465de1829c-config-data" (OuterVolumeSpecName: "config-data") pod "e2c76976-8cdb-45e0-826d-5d465de1829c" (UID: "e2c76976-8cdb-45e0-826d-5d465de1829c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.053754 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c76976-8cdb-45e0-826d-5d465de1829c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.053780 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2c76976-8cdb-45e0-826d-5d465de1829c-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.053793 4687 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2c76976-8cdb-45e0-826d-5d465de1829c-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.053803 4687 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e2c76976-8cdb-45e0-826d-5d465de1829c-logs\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.053811 4687 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2c76976-8cdb-45e0-826d-5d465de1829c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.053821 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcfj4\" (UniqueName: \"kubernetes.io/projected/e2c76976-8cdb-45e0-826d-5d465de1829c-kube-api-access-lcfj4\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.053831 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2c76976-8cdb-45e0-826d-5d465de1829c-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.111764 4687 scope.go:117] "RemoveContainer" containerID="1b04a115a7dbb0f4ad683d7685204049b7616de2f4a65926728bdcd67cc13ace" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.131996 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.136650 4687 scope.go:117] "RemoveContainer" containerID="e6a89344c10867261364545fc098816806109be63afd73b0b07cdc65a69c5a00" Feb 28 09:20:46 crc kubenswrapper[4687]: E0228 09:20:46.136992 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6a89344c10867261364545fc098816806109be63afd73b0b07cdc65a69c5a00\": container with ID starting with e6a89344c10867261364545fc098816806109be63afd73b0b07cdc65a69c5a00 not found: ID does not exist" containerID="e6a89344c10867261364545fc098816806109be63afd73b0b07cdc65a69c5a00" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.137039 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6a89344c10867261364545fc098816806109be63afd73b0b07cdc65a69c5a00"} err="failed to get container status \"e6a89344c10867261364545fc098816806109be63afd73b0b07cdc65a69c5a00\": rpc error: code = NotFound desc = could not find container \"e6a89344c10867261364545fc098816806109be63afd73b0b07cdc65a69c5a00\": container with ID starting with e6a89344c10867261364545fc098816806109be63afd73b0b07cdc65a69c5a00 not found: ID does not exist" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.137070 4687 scope.go:117] "RemoveContainer" containerID="1b04a115a7dbb0f4ad683d7685204049b7616de2f4a65926728bdcd67cc13ace" Feb 28 09:20:46 crc kubenswrapper[4687]: E0228 09:20:46.137332 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b04a115a7dbb0f4ad683d7685204049b7616de2f4a65926728bdcd67cc13ace\": container with ID starting with 1b04a115a7dbb0f4ad683d7685204049b7616de2f4a65926728bdcd67cc13ace not found: ID does not exist" containerID="1b04a115a7dbb0f4ad683d7685204049b7616de2f4a65926728bdcd67cc13ace" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.137348 
4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b04a115a7dbb0f4ad683d7685204049b7616de2f4a65926728bdcd67cc13ace"} err="failed to get container status \"1b04a115a7dbb0f4ad683d7685204049b7616de2f4a65926728bdcd67cc13ace\": rpc error: code = NotFound desc = could not find container \"1b04a115a7dbb0f4ad683d7685204049b7616de2f4a65926728bdcd67cc13ace\": container with ID starting with 1b04a115a7dbb0f4ad683d7685204049b7616de2f4a65926728bdcd67cc13ace not found: ID does not exist" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.137360 4687 scope.go:117] "RemoveContainer" containerID="e6a89344c10867261364545fc098816806109be63afd73b0b07cdc65a69c5a00" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.137659 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6a89344c10867261364545fc098816806109be63afd73b0b07cdc65a69c5a00"} err="failed to get container status \"e6a89344c10867261364545fc098816806109be63afd73b0b07cdc65a69c5a00\": rpc error: code = NotFound desc = could not find container \"e6a89344c10867261364545fc098816806109be63afd73b0b07cdc65a69c5a00\": container with ID starting with e6a89344c10867261364545fc098816806109be63afd73b0b07cdc65a69c5a00 not found: ID does not exist" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.137676 4687 scope.go:117] "RemoveContainer" containerID="1b04a115a7dbb0f4ad683d7685204049b7616de2f4a65926728bdcd67cc13ace" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.137949 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b04a115a7dbb0f4ad683d7685204049b7616de2f4a65926728bdcd67cc13ace"} err="failed to get container status \"1b04a115a7dbb0f4ad683d7685204049b7616de2f4a65926728bdcd67cc13ace\": rpc error: code = NotFound desc = could not find container \"1b04a115a7dbb0f4ad683d7685204049b7616de2f4a65926728bdcd67cc13ace\": container with ID starting with 
1b04a115a7dbb0f4ad683d7685204049b7616de2f4a65926728bdcd67cc13ace not found: ID does not exist" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.155161 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wvxd\" (UniqueName: \"kubernetes.io/projected/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-kube-api-access-2wvxd\") pod \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.155313 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-log-httpd\") pod \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.155397 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-sg-core-conf-yaml\") pod \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.155510 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-combined-ca-bundle\") pod \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.155633 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-config-data\") pod \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.155743 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2a0893a8-0386-4d6d-9476-c061c3fb5f3d" (UID: "2a0893a8-0386-4d6d-9476-c061c3fb5f3d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.156224 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-run-httpd\") pod \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.156369 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-scripts\") pod \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\" (UID: \"2a0893a8-0386-4d6d-9476-c061c3fb5f3d\") " Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.156513 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2a0893a8-0386-4d6d-9476-c061c3fb5f3d" (UID: "2a0893a8-0386-4d6d-9476-c061c3fb5f3d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.157119 4687 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.157183 4687 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.159836 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-scripts" (OuterVolumeSpecName: "scripts") pod "2a0893a8-0386-4d6d-9476-c061c3fb5f3d" (UID: "2a0893a8-0386-4d6d-9476-c061c3fb5f3d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.159879 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-kube-api-access-2wvxd" (OuterVolumeSpecName: "kube-api-access-2wvxd") pod "2a0893a8-0386-4d6d-9476-c061c3fb5f3d" (UID: "2a0893a8-0386-4d6d-9476-c061c3fb5f3d"). InnerVolumeSpecName "kube-api-access-2wvxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.177750 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2a0893a8-0386-4d6d-9476-c061c3fb5f3d" (UID: "2a0893a8-0386-4d6d-9476-c061c3fb5f3d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.196212 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a0893a8-0386-4d6d-9476-c061c3fb5f3d" (UID: "2a0893a8-0386-4d6d-9476-c061c3fb5f3d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.216283 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-config-data" (OuterVolumeSpecName: "config-data") pod "2a0893a8-0386-4d6d-9476-c061c3fb5f3d" (UID: "2a0893a8-0386-4d6d-9476-c061c3fb5f3d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.258975 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.259143 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wvxd\" (UniqueName: \"kubernetes.io/projected/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-kube-api-access-2wvxd\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.259206 4687 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.259259 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.259317 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a0893a8-0386-4d6d-9476-c061c3fb5f3d-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.300089 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.310756 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.318354 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 28 09:20:46 crc kubenswrapper[4687]: E0228 09:20:46.318750 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2c76976-8cdb-45e0-826d-5d465de1829c" containerName="cinder-api-log" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.318762 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2c76976-8cdb-45e0-826d-5d465de1829c" containerName="cinder-api-log" Feb 28 09:20:46 crc kubenswrapper[4687]: E0228 09:20:46.318772 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a0893a8-0386-4d6d-9476-c061c3fb5f3d" containerName="ceilometer-notification-agent" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.318777 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0893a8-0386-4d6d-9476-c061c3fb5f3d" containerName="ceilometer-notification-agent" Feb 28 09:20:46 crc kubenswrapper[4687]: E0228 09:20:46.318789 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2c76976-8cdb-45e0-826d-5d465de1829c" containerName="cinder-api" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.318795 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2c76976-8cdb-45e0-826d-5d465de1829c" containerName="cinder-api" Feb 28 09:20:46 crc kubenswrapper[4687]: E0228 09:20:46.318801 4687 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a0893a8-0386-4d6d-9476-c061c3fb5f3d" containerName="sg-core" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.318807 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0893a8-0386-4d6d-9476-c061c3fb5f3d" containerName="sg-core" Feb 28 09:20:46 crc kubenswrapper[4687]: E0228 09:20:46.318821 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad05ef2-b8b3-4844-a104-7bf24d1398b0" containerName="init" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.318827 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad05ef2-b8b3-4844-a104-7bf24d1398b0" containerName="init" Feb 28 09:20:46 crc kubenswrapper[4687]: E0228 09:20:46.318845 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a0893a8-0386-4d6d-9476-c061c3fb5f3d" containerName="proxy-httpd" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.318851 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0893a8-0386-4d6d-9476-c061c3fb5f3d" containerName="proxy-httpd" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.319012 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2c76976-8cdb-45e0-826d-5d465de1829c" containerName="cinder-api-log" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.319037 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a0893a8-0386-4d6d-9476-c061c3fb5f3d" containerName="ceilometer-notification-agent" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.319047 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a0893a8-0386-4d6d-9476-c061c3fb5f3d" containerName="sg-core" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.319067 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="bad05ef2-b8b3-4844-a104-7bf24d1398b0" containerName="init" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.319080 4687 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="2a0893a8-0386-4d6d-9476-c061c3fb5f3d" containerName="proxy-httpd" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.319089 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2c76976-8cdb-45e0-826d-5d465de1829c" containerName="cinder-api" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.319976 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.322828 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.323068 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.323300 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.329224 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.360727 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7902e63-a118-4905-ad9d-3a4d15edce78-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.360770 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7902e63-a118-4905-ad9d-3a4d15edce78-config-data-custom\") pod \"cinder-api-0\" (UID: \"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.360796 4687 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7902e63-a118-4905-ad9d-3a4d15edce78-logs\") pod \"cinder-api-0\" (UID: \"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.360820 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7902e63-a118-4905-ad9d-3a4d15edce78-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.360835 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg8k7\" (UniqueName: \"kubernetes.io/projected/c7902e63-a118-4905-ad9d-3a4d15edce78-kube-api-access-cg8k7\") pod \"cinder-api-0\" (UID: \"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.360856 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7902e63-a118-4905-ad9d-3a4d15edce78-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.361127 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7902e63-a118-4905-ad9d-3a4d15edce78-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.361166 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c7902e63-a118-4905-ad9d-3a4d15edce78-scripts\") pod \"cinder-api-0\" (UID: \"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.361257 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7902e63-a118-4905-ad9d-3a4d15edce78-config-data\") pod \"cinder-api-0\" (UID: \"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.463080 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7902e63-a118-4905-ad9d-3a4d15edce78-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.463141 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7902e63-a118-4905-ad9d-3a4d15edce78-scripts\") pod \"cinder-api-0\" (UID: \"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.463245 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7902e63-a118-4905-ad9d-3a4d15edce78-config-data\") pod \"cinder-api-0\" (UID: \"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.463344 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7902e63-a118-4905-ad9d-3a4d15edce78-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 
09:20:46.463375 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7902e63-a118-4905-ad9d-3a4d15edce78-config-data-custom\") pod \"cinder-api-0\" (UID: \"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.463391 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7902e63-a118-4905-ad9d-3a4d15edce78-logs\") pod \"cinder-api-0\" (UID: \"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.463417 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7902e63-a118-4905-ad9d-3a4d15edce78-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.463439 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg8k7\" (UniqueName: \"kubernetes.io/projected/c7902e63-a118-4905-ad9d-3a4d15edce78-kube-api-access-cg8k7\") pod \"cinder-api-0\" (UID: \"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.463460 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7902e63-a118-4905-ad9d-3a4d15edce78-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.467892 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7902e63-a118-4905-ad9d-3a4d15edce78-logs\") pod \"cinder-api-0\" (UID: 
\"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.467973 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7902e63-a118-4905-ad9d-3a4d15edce78-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.467991 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7902e63-a118-4905-ad9d-3a4d15edce78-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.470701 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7902e63-a118-4905-ad9d-3a4d15edce78-config-data-custom\") pod \"cinder-api-0\" (UID: \"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.471087 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7902e63-a118-4905-ad9d-3a4d15edce78-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.471456 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7902e63-a118-4905-ad9d-3a4d15edce78-scripts\") pod \"cinder-api-0\" (UID: \"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.475882 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c7902e63-a118-4905-ad9d-3a4d15edce78-config-data\") pod \"cinder-api-0\" (UID: \"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.480068 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7902e63-a118-4905-ad9d-3a4d15edce78-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.482799 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg8k7\" (UniqueName: \"kubernetes.io/projected/c7902e63-a118-4905-ad9d-3a4d15edce78-kube-api-access-cg8k7\") pod \"cinder-api-0\" (UID: \"c7902e63-a118-4905-ad9d-3a4d15edce78\") " pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.645390 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.675216 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2c76976-8cdb-45e0-826d-5d465de1829c" path="/var/lib/kubelet/pods/e2c76976-8cdb-45e0-826d-5d465de1829c/volumes" Feb 28 09:20:46 crc kubenswrapper[4687]: I0228 09:20:46.758172 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d8d4bb8d-87zwm" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.003691 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a0893a8-0386-4d6d-9476-c061c3fb5f3d","Type":"ContainerDied","Data":"b83edc249187f94706cb88fa7b442c63cc2c247afe76eefd355ca88641fe4c06"} Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.003732 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.003752 4687 scope.go:117] "RemoveContainer" containerID="a1fed5d6d5d0aa69f698abb8377dbac8adffa03875f6933981857f1250afb4e3" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.047014 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.052129 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.058524 4687 scope.go:117] "RemoveContainer" containerID="ec2211cc8159f7654685062ebd6bbc5d493f2f317474a1dfca1a6c26b052d1b7" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.060109 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.075568 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.077687 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.099668 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.100279 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.139991 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.156578 4687 scope.go:117] "RemoveContainer" containerID="404b8da225a564a9322c0d472094c80332802f0e803b8ac973b8bb4bfb07d4de" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.178145 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-scripts\") pod \"ceilometer-0\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " pod="openstack/ceilometer-0" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.178508 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm7vl\" (UniqueName: \"kubernetes.io/projected/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-kube-api-access-hm7vl\") pod \"ceilometer-0\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " pod="openstack/ceilometer-0" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.178803 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " pod="openstack/ceilometer-0" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.178979 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-log-httpd\") pod \"ceilometer-0\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " pod="openstack/ceilometer-0" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.179097 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-config-data\") pod \"ceilometer-0\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " pod="openstack/ceilometer-0" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.179185 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-run-httpd\") pod \"ceilometer-0\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " pod="openstack/ceilometer-0" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.179413 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " pod="openstack/ceilometer-0" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.280855 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-scripts\") pod \"ceilometer-0\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " pod="openstack/ceilometer-0" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.280960 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm7vl\" (UniqueName: \"kubernetes.io/projected/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-kube-api-access-hm7vl\") pod \"ceilometer-0\" (UID: 
\"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " pod="openstack/ceilometer-0" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.281005 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " pod="openstack/ceilometer-0" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.281070 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-log-httpd\") pod \"ceilometer-0\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " pod="openstack/ceilometer-0" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.281093 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-config-data\") pod \"ceilometer-0\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " pod="openstack/ceilometer-0" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.281113 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-run-httpd\") pod \"ceilometer-0\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " pod="openstack/ceilometer-0" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.281199 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " pod="openstack/ceilometer-0" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.282526 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-log-httpd\") pod \"ceilometer-0\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " pod="openstack/ceilometer-0" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.283617 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-run-httpd\") pod \"ceilometer-0\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " pod="openstack/ceilometer-0" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.287439 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " pod="openstack/ceilometer-0" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.288454 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " pod="openstack/ceilometer-0" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.295589 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-config-data\") pod \"ceilometer-0\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " pod="openstack/ceilometer-0" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.299903 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm7vl\" (UniqueName: \"kubernetes.io/projected/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-kube-api-access-hm7vl\") pod \"ceilometer-0\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " pod="openstack/ceilometer-0" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.305304 4687 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-scripts\") pod \"ceilometer-0\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " pod="openstack/ceilometer-0" Feb 28 09:20:47 crc kubenswrapper[4687]: I0228 09:20:47.419689 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:20:48 crc kubenswrapper[4687]: I0228 09:20:48.036974 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:20:48 crc kubenswrapper[4687]: I0228 09:20:48.054873 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c7902e63-a118-4905-ad9d-3a4d15edce78","Type":"ContainerStarted","Data":"5f25377d1c9b3faf29248abb7129d431de6e0c29a5fbb5822f817ff08f1522ba"} Feb 28 09:20:48 crc kubenswrapper[4687]: I0228 09:20:48.055007 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c7902e63-a118-4905-ad9d-3a4d15edce78","Type":"ContainerStarted","Data":"dca4c62b92bbadaccf9dd50b8f3ea9c4ae2eea49f1c257a495eec1743a9b2aa9"} Feb 28 09:20:48 crc kubenswrapper[4687]: I0228 09:20:48.325415 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d8d4bb8d-87zwm" Feb 28 09:20:48 crc kubenswrapper[4687]: I0228 09:20:48.677771 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a0893a8-0386-4d6d-9476-c061c3fb5f3d" path="/var/lib/kubelet/pods/2a0893a8-0386-4d6d-9476-c061c3fb5f3d/volumes" Feb 28 09:20:49 crc kubenswrapper[4687]: I0228 09:20:49.066789 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"53078962-6c8c-436e-8d57-e2ed7e9e2b6e","Type":"ContainerStarted","Data":"f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795"} Feb 28 09:20:49 crc kubenswrapper[4687]: I0228 09:20:49.067100 4687 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"53078962-6c8c-436e-8d57-e2ed7e9e2b6e","Type":"ContainerStarted","Data":"6f04d7668d3a0f6c9e5c4526624926cdd5fefdd5b3f4afa0334344c5ce1f2d8e"} Feb 28 09:20:49 crc kubenswrapper[4687]: I0228 09:20:49.069846 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c7902e63-a118-4905-ad9d-3a4d15edce78","Type":"ContainerStarted","Data":"8fc87e1e1cc04d18ded1eee9cb98cd4fc06170225e9e897250198c263383489e"} Feb 28 09:20:49 crc kubenswrapper[4687]: I0228 09:20:49.070011 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 28 09:20:49 crc kubenswrapper[4687]: I0228 09:20:49.095050 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.095015314 podStartE2EDuration="3.095015314s" podCreationTimestamp="2026-02-28 09:20:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:20:49.086799836 +0000 UTC m=+1040.777369172" watchObservedRunningTime="2026-02-28 09:20:49.095015314 +0000 UTC m=+1040.785584651" Feb 28 09:20:49 crc kubenswrapper[4687]: I0228 09:20:49.096415 4687 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5d58956cb6-f8plp" podUID="6a06887c-91c5-43bb-8631-53fac29e79b6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:42482->10.217.0.152:8443: read: connection reset by peer" Feb 28 09:20:49 crc kubenswrapper[4687]: I0228 09:20:49.712227 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:49 crc kubenswrapper[4687]: I0228 09:20:49.715745 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/barbican-api-6f95b8bb44-tjzcn" Feb 28 09:20:49 crc kubenswrapper[4687]: I0228 09:20:49.812176 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5d8d4bb8d-87zwm"] Feb 28 09:20:49 crc kubenswrapper[4687]: I0228 09:20:49.812384 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5d8d4bb8d-87zwm" podUID="aa88f1b2-477c-461c-a044-88fd35c31231" containerName="barbican-api-log" containerID="cri-o://a9fe336de21c280d3de01f19fc9fddfa3d280f1561ff146cf049d7465c378b2d" gracePeriod=30 Feb 28 09:20:49 crc kubenswrapper[4687]: I0228 09:20:49.812726 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5d8d4bb8d-87zwm" podUID="aa88f1b2-477c-461c-a044-88fd35c31231" containerName="barbican-api" containerID="cri-o://b1e3eae08f7f2f92c12748feeba374abf768df6a9244158975b18d0b40305051" gracePeriod=30 Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.082475 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"53078962-6c8c-436e-8d57-e2ed7e9e2b6e","Type":"ContainerStarted","Data":"351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743"} Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.084758 4687 generic.go:334] "Generic (PLEG): container finished" podID="76f683cb-cc38-4cdd-a0f0-1077410b1768" containerID="1c9ef7104fc110694f07caf4f711aeccd7d3058ee4396c11ab6e145a2805b318" exitCode=137 Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.084805 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6774d8fcc9-lpttg" event={"ID":"76f683cb-cc38-4cdd-a0f0-1077410b1768","Type":"ContainerDied","Data":"1c9ef7104fc110694f07caf4f711aeccd7d3058ee4396c11ab6e145a2805b318"} Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.086776 4687 generic.go:334] "Generic (PLEG): container finished" podID="6a06887c-91c5-43bb-8631-53fac29e79b6" 
containerID="57eba8c8848cfdc58b9d231bc4a845a3aef1d76384a7fc2e2fb3b3a4dcffe324" exitCode=0 Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.086819 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d58956cb6-f8plp" event={"ID":"6a06887c-91c5-43bb-8631-53fac29e79b6","Type":"ContainerDied","Data":"57eba8c8848cfdc58b9d231bc4a845a3aef1d76384a7fc2e2fb3b3a4dcffe324"} Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.087886 4687 generic.go:334] "Generic (PLEG): container finished" podID="aa88f1b2-477c-461c-a044-88fd35c31231" containerID="a9fe336de21c280d3de01f19fc9fddfa3d280f1561ff146cf049d7465c378b2d" exitCode=143 Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.087933 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d8d4bb8d-87zwm" event={"ID":"aa88f1b2-477c-461c-a044-88fd35c31231","Type":"ContainerDied","Data":"a9fe336de21c280d3de01f19fc9fddfa3d280f1561ff146cf049d7465c378b2d"} Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.089433 4687 generic.go:334] "Generic (PLEG): container finished" podID="84c40408-c638-4bea-86d5-fb40a60b6975" containerID="b68cf1027bec3caa61756b0cafa9065fb6425e37e50d692bf7d2a9d913ffb111" exitCode=137 Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.089470 4687 generic.go:334] "Generic (PLEG): container finished" podID="84c40408-c638-4bea-86d5-fb40a60b6975" containerID="61b4a041e894cea9908f5c65adf16323390c21a79756431a89555ce4ae9d050a" exitCode=137 Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.089510 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9b795df4f-65xfj" event={"ID":"84c40408-c638-4bea-86d5-fb40a60b6975","Type":"ContainerDied","Data":"b68cf1027bec3caa61756b0cafa9065fb6425e37e50d692bf7d2a9d913ffb111"} Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.089531 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9b795df4f-65xfj" 
event={"ID":"84c40408-c638-4bea-86d5-fb40a60b6975","Type":"ContainerDied","Data":"61b4a041e894cea9908f5c65adf16323390c21a79756431a89555ce4ae9d050a"} Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.092628 4687 generic.go:334] "Generic (PLEG): container finished" podID="27799696-4eb6-4ef9-9440-151a3929d699" containerID="aafeb892e6f15626514b11a0c74fd9d9c18cc477eec929ba61e66e431cb01d28" exitCode=137 Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.093566 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-94db9c8bf-6qj27" event={"ID":"27799696-4eb6-4ef9-9440-151a3929d699","Type":"ContainerDied","Data":"aafeb892e6f15626514b11a0c74fd9d9c18cc477eec929ba61e66e431cb01d28"} Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.379797 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9b795df4f-65xfj" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.401438 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.451904 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84c40408-c638-4bea-86d5-fb40a60b6975-config-data\") pod \"84c40408-c638-4bea-86d5-fb40a60b6975\" (UID: \"84c40408-c638-4bea-86d5-fb40a60b6975\") " Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.451953 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/84c40408-c638-4bea-86d5-fb40a60b6975-horizon-secret-key\") pod \"84c40408-c638-4bea-86d5-fb40a60b6975\" (UID: \"84c40408-c638-4bea-86d5-fb40a60b6975\") " Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.451988 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4jsm\" (UniqueName: 
\"kubernetes.io/projected/84c40408-c638-4bea-86d5-fb40a60b6975-kube-api-access-h4jsm\") pod \"84c40408-c638-4bea-86d5-fb40a60b6975\" (UID: \"84c40408-c638-4bea-86d5-fb40a60b6975\") " Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.452013 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84c40408-c638-4bea-86d5-fb40a60b6975-scripts\") pod \"84c40408-c638-4bea-86d5-fb40a60b6975\" (UID: \"84c40408-c638-4bea-86d5-fb40a60b6975\") " Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.452152 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84c40408-c638-4bea-86d5-fb40a60b6975-logs\") pod \"84c40408-c638-4bea-86d5-fb40a60b6975\" (UID: \"84c40408-c638-4bea-86d5-fb40a60b6975\") " Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.454663 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84c40408-c638-4bea-86d5-fb40a60b6975-logs" (OuterVolumeSpecName: "logs") pod "84c40408-c638-4bea-86d5-fb40a60b6975" (UID: "84c40408-c638-4bea-86d5-fb40a60b6975"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.482138 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84c40408-c638-4bea-86d5-fb40a60b6975-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "84c40408-c638-4bea-86d5-fb40a60b6975" (UID: "84c40408-c638-4bea-86d5-fb40a60b6975"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.484600 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84c40408-c638-4bea-86d5-fb40a60b6975-kube-api-access-h4jsm" (OuterVolumeSpecName: "kube-api-access-h4jsm") pod "84c40408-c638-4bea-86d5-fb40a60b6975" (UID: "84c40408-c638-4bea-86d5-fb40a60b6975"). InnerVolumeSpecName "kube-api-access-h4jsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.503957 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84c40408-c638-4bea-86d5-fb40a60b6975-config-data" (OuterVolumeSpecName: "config-data") pod "84c40408-c638-4bea-86d5-fb40a60b6975" (UID: "84c40408-c638-4bea-86d5-fb40a60b6975"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.532603 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84c40408-c638-4bea-86d5-fb40a60b6975-scripts" (OuterVolumeSpecName: "scripts") pod "84c40408-c638-4bea-86d5-fb40a60b6975" (UID: "84c40408-c638-4bea-86d5-fb40a60b6975"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.541995 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-crzs5"] Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.542282 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7859c7799c-crzs5" podUID="455f8be2-a725-49fb-ba76-6f3e6c4cb34d" containerName="dnsmasq-dns" containerID="cri-o://99d7de2b7db74ea8113bfc0922f6805ecb6418596566a8ad9d8acf61d9569ffd" gracePeriod=10 Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.555302 4687 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84c40408-c638-4bea-86d5-fb40a60b6975-logs\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.555330 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84c40408-c638-4bea-86d5-fb40a60b6975-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.555345 4687 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/84c40408-c638-4bea-86d5-fb40a60b6975-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.555355 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4jsm\" (UniqueName: \"kubernetes.io/projected/84c40408-c638-4bea-86d5-fb40a60b6975-kube-api-access-h4jsm\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.555364 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84c40408-c638-4bea-86d5-fb40a60b6975-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.710286 4687 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.759919 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.763138 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6774d8fcc9-lpttg" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.775092 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-94db9c8bf-6qj27" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.865806 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76f683cb-cc38-4cdd-a0f0-1077410b1768-horizon-secret-key\") pod \"76f683cb-cc38-4cdd-a0f0-1077410b1768\" (UID: \"76f683cb-cc38-4cdd-a0f0-1077410b1768\") " Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.865861 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/27799696-4eb6-4ef9-9440-151a3929d699-horizon-secret-key\") pod \"27799696-4eb6-4ef9-9440-151a3929d699\" (UID: \"27799696-4eb6-4ef9-9440-151a3929d699\") " Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.865920 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27799696-4eb6-4ef9-9440-151a3929d699-logs\") pod \"27799696-4eb6-4ef9-9440-151a3929d699\" (UID: \"27799696-4eb6-4ef9-9440-151a3929d699\") " Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.865977 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/27799696-4eb6-4ef9-9440-151a3929d699-config-data\") pod \"27799696-4eb6-4ef9-9440-151a3929d699\" (UID: \"27799696-4eb6-4ef9-9440-151a3929d699\") 
" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.866121 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/27799696-4eb6-4ef9-9440-151a3929d699-scripts\") pod \"27799696-4eb6-4ef9-9440-151a3929d699\" (UID: \"27799696-4eb6-4ef9-9440-151a3929d699\") " Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.866244 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv864\" (UniqueName: \"kubernetes.io/projected/27799696-4eb6-4ef9-9440-151a3929d699-kube-api-access-sv864\") pod \"27799696-4eb6-4ef9-9440-151a3929d699\" (UID: \"27799696-4eb6-4ef9-9440-151a3929d699\") " Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.866333 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vg96\" (UniqueName: \"kubernetes.io/projected/76f683cb-cc38-4cdd-a0f0-1077410b1768-kube-api-access-4vg96\") pod \"76f683cb-cc38-4cdd-a0f0-1077410b1768\" (UID: \"76f683cb-cc38-4cdd-a0f0-1077410b1768\") " Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.866363 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76f683cb-cc38-4cdd-a0f0-1077410b1768-config-data\") pod \"76f683cb-cc38-4cdd-a0f0-1077410b1768\" (UID: \"76f683cb-cc38-4cdd-a0f0-1077410b1768\") " Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.866392 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76f683cb-cc38-4cdd-a0f0-1077410b1768-scripts\") pod \"76f683cb-cc38-4cdd-a0f0-1077410b1768\" (UID: \"76f683cb-cc38-4cdd-a0f0-1077410b1768\") " Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.866431 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/76f683cb-cc38-4cdd-a0f0-1077410b1768-logs\") pod \"76f683cb-cc38-4cdd-a0f0-1077410b1768\" (UID: \"76f683cb-cc38-4cdd-a0f0-1077410b1768\") " Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.866609 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27799696-4eb6-4ef9-9440-151a3929d699-logs" (OuterVolumeSpecName: "logs") pod "27799696-4eb6-4ef9-9440-151a3929d699" (UID: "27799696-4eb6-4ef9-9440-151a3929d699"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.867869 4687 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27799696-4eb6-4ef9-9440-151a3929d699-logs\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.868227 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76f683cb-cc38-4cdd-a0f0-1077410b1768-logs" (OuterVolumeSpecName: "logs") pod "76f683cb-cc38-4cdd-a0f0-1077410b1768" (UID: "76f683cb-cc38-4cdd-a0f0-1077410b1768"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.876676 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27799696-4eb6-4ef9-9440-151a3929d699-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "27799696-4eb6-4ef9-9440-151a3929d699" (UID: "27799696-4eb6-4ef9-9440-151a3929d699"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.876813 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76f683cb-cc38-4cdd-a0f0-1077410b1768-kube-api-access-4vg96" (OuterVolumeSpecName: "kube-api-access-4vg96") pod "76f683cb-cc38-4cdd-a0f0-1077410b1768" (UID: "76f683cb-cc38-4cdd-a0f0-1077410b1768"). InnerVolumeSpecName "kube-api-access-4vg96". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.877067 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76f683cb-cc38-4cdd-a0f0-1077410b1768-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "76f683cb-cc38-4cdd-a0f0-1077410b1768" (UID: "76f683cb-cc38-4cdd-a0f0-1077410b1768"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.897630 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27799696-4eb6-4ef9-9440-151a3929d699-kube-api-access-sv864" (OuterVolumeSpecName: "kube-api-access-sv864") pod "27799696-4eb6-4ef9-9440-151a3929d699" (UID: "27799696-4eb6-4ef9-9440-151a3929d699"). InnerVolumeSpecName "kube-api-access-sv864". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.919657 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76f683cb-cc38-4cdd-a0f0-1077410b1768-scripts" (OuterVolumeSpecName: "scripts") pod "76f683cb-cc38-4cdd-a0f0-1077410b1768" (UID: "76f683cb-cc38-4cdd-a0f0-1077410b1768"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.919853 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27799696-4eb6-4ef9-9440-151a3929d699-config-data" (OuterVolumeSpecName: "config-data") pod "27799696-4eb6-4ef9-9440-151a3929d699" (UID: "27799696-4eb6-4ef9-9440-151a3929d699"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.935524 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27799696-4eb6-4ef9-9440-151a3929d699-scripts" (OuterVolumeSpecName: "scripts") pod "27799696-4eb6-4ef9-9440-151a3929d699" (UID: "27799696-4eb6-4ef9-9440-151a3929d699"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.936607 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76f683cb-cc38-4cdd-a0f0-1077410b1768-config-data" (OuterVolumeSpecName: "config-data") pod "76f683cb-cc38-4cdd-a0f0-1077410b1768" (UID: "76f683cb-cc38-4cdd-a0f0-1077410b1768"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.969578 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/27799696-4eb6-4ef9-9440-151a3929d699-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.969627 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/27799696-4eb6-4ef9-9440-151a3929d699-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.969638 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sv864\" (UniqueName: \"kubernetes.io/projected/27799696-4eb6-4ef9-9440-151a3929d699-kube-api-access-sv864\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.969650 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vg96\" (UniqueName: \"kubernetes.io/projected/76f683cb-cc38-4cdd-a0f0-1077410b1768-kube-api-access-4vg96\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.969659 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76f683cb-cc38-4cdd-a0f0-1077410b1768-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.969668 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76f683cb-cc38-4cdd-a0f0-1077410b1768-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.969675 4687 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76f683cb-cc38-4cdd-a0f0-1077410b1768-logs\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.969683 4687 reconciler_common.go:293] 
"Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76f683cb-cc38-4cdd-a0f0-1077410b1768-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:50 crc kubenswrapper[4687]: I0228 09:20:50.969692 4687 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/27799696-4eb6-4ef9-9440-151a3929d699-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.067114 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-crzs5" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.114749 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"53078962-6c8c-436e-8d57-e2ed7e9e2b6e","Type":"ContainerStarted","Data":"10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2"} Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.127355 4687 generic.go:334] "Generic (PLEG): container finished" podID="76f683cb-cc38-4cdd-a0f0-1077410b1768" containerID="f325690874bfb899167706dea38c4f57ef91836e19d44224b585c114ace4221d" exitCode=137 Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.127424 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6774d8fcc9-lpttg" event={"ID":"76f683cb-cc38-4cdd-a0f0-1077410b1768","Type":"ContainerDied","Data":"f325690874bfb899167706dea38c4f57ef91836e19d44224b585c114ace4221d"} Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.127454 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6774d8fcc9-lpttg" event={"ID":"76f683cb-cc38-4cdd-a0f0-1077410b1768","Type":"ContainerDied","Data":"e8d5b812f2edc197ed1fa8b0a0914b0152c058afe4646764eeeefcfa6ffe9e43"} Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.127472 4687 scope.go:117] "RemoveContainer" containerID="f325690874bfb899167706dea38c4f57ef91836e19d44224b585c114ace4221d" Feb 
28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.127588 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6774d8fcc9-lpttg" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.150980 4687 generic.go:334] "Generic (PLEG): container finished" podID="455f8be2-a725-49fb-ba76-6f3e6c4cb34d" containerID="99d7de2b7db74ea8113bfc0922f6805ecb6418596566a8ad9d8acf61d9569ffd" exitCode=0 Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.151080 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-crzs5" event={"ID":"455f8be2-a725-49fb-ba76-6f3e6c4cb34d","Type":"ContainerDied","Data":"99d7de2b7db74ea8113bfc0922f6805ecb6418596566a8ad9d8acf61d9569ffd"} Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.151133 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-crzs5" event={"ID":"455f8be2-a725-49fb-ba76-6f3e6c4cb34d","Type":"ContainerDied","Data":"9ba3383f945d7b2472026c92c72afaf80f70e31989b5540c8090bf0e0bff0dcd"} Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.151091 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-crzs5" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.169219 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9b795df4f-65xfj" event={"ID":"84c40408-c638-4bea-86d5-fb40a60b6975","Type":"ContainerDied","Data":"0a0a86b425d00964404e37a811d5b05c915d79a2bfeb451d659a2a38fec5dd2f"} Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.169293 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-9b795df4f-65xfj" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.175367 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-dns-swift-storage-0\") pod \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\" (UID: \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\") " Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.175477 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvfqj\" (UniqueName: \"kubernetes.io/projected/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-kube-api-access-zvfqj\") pod \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\" (UID: \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\") " Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.175544 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-ovsdbserver-sb\") pod \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\" (UID: \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\") " Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.175636 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-dns-svc\") pod \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\" (UID: \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\") " Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.175664 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-config\") pod \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\" (UID: \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\") " Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.175698 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-ovsdbserver-nb\") pod \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\" (UID: \"455f8be2-a725-49fb-ba76-6f3e6c4cb34d\") " Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.178007 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6774d8fcc9-lpttg"] Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.181905 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-kube-api-access-zvfqj" (OuterVolumeSpecName: "kube-api-access-zvfqj") pod "455f8be2-a725-49fb-ba76-6f3e6c4cb34d" (UID: "455f8be2-a725-49fb-ba76-6f3e6c4cb34d"). InnerVolumeSpecName "kube-api-access-zvfqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.185001 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6774d8fcc9-lpttg"] Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.189674 4687 generic.go:334] "Generic (PLEG): container finished" podID="27799696-4eb6-4ef9-9440-151a3929d699" containerID="712de4921f163318aadd23457ab174bf0c4fb55adf335f7d52d76cf15375c37e" exitCode=137 Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.189832 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-94db9c8bf-6qj27" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.189906 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-94db9c8bf-6qj27" event={"ID":"27799696-4eb6-4ef9-9440-151a3929d699","Type":"ContainerDied","Data":"712de4921f163318aadd23457ab174bf0c4fb55adf335f7d52d76cf15375c37e"} Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.189940 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b82a8aed-cc7b-4802-80f0-63e701ee0593" containerName="cinder-scheduler" containerID="cri-o://76ec839f85509f3b4f49b9c4eca232d78d0752975887739b09646d96f7698d09" gracePeriod=30 Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.190111 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b82a8aed-cc7b-4802-80f0-63e701ee0593" containerName="probe" containerID="cri-o://85a9619d18290d62f51f4d9dba3f1999673011b6aeca5038cb9261a82771a841" gracePeriod=30 Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.189948 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-94db9c8bf-6qj27" event={"ID":"27799696-4eb6-4ef9-9440-151a3929d699","Type":"ContainerDied","Data":"9cb4ddb764f0c5a30b40d129e1c56024c04b6a19cb224b015cfc83c54194d2da"} Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.203643 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9b795df4f-65xfj"] Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.212433 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-9b795df4f-65xfj"] Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.237804 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-94db9c8bf-6qj27"] Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.246002 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/horizon-94db9c8bf-6qj27"] Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.271165 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "455f8be2-a725-49fb-ba76-6f3e6c4cb34d" (UID: "455f8be2-a725-49fb-ba76-6f3e6c4cb34d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.271589 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "455f8be2-a725-49fb-ba76-6f3e6c4cb34d" (UID: "455f8be2-a725-49fb-ba76-6f3e6c4cb34d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.272668 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "455f8be2-a725-49fb-ba76-6f3e6c4cb34d" (UID: "455f8be2-a725-49fb-ba76-6f3e6c4cb34d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.278892 4687 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.278920 4687 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.278929 4687 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.278938 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvfqj\" (UniqueName: \"kubernetes.io/projected/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-kube-api-access-zvfqj\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.280367 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "455f8be2-a725-49fb-ba76-6f3e6c4cb34d" (UID: "455f8be2-a725-49fb-ba76-6f3e6c4cb34d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.291923 4687 scope.go:117] "RemoveContainer" containerID="1c9ef7104fc110694f07caf4f711aeccd7d3058ee4396c11ab6e145a2805b318" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.312137 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-config" (OuterVolumeSpecName: "config") pod "455f8be2-a725-49fb-ba76-6f3e6c4cb34d" (UID: "455f8be2-a725-49fb-ba76-6f3e6c4cb34d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.315216 4687 scope.go:117] "RemoveContainer" containerID="f325690874bfb899167706dea38c4f57ef91836e19d44224b585c114ace4221d" Feb 28 09:20:51 crc kubenswrapper[4687]: E0228 09:20:51.315816 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f325690874bfb899167706dea38c4f57ef91836e19d44224b585c114ace4221d\": container with ID starting with f325690874bfb899167706dea38c4f57ef91836e19d44224b585c114ace4221d not found: ID does not exist" containerID="f325690874bfb899167706dea38c4f57ef91836e19d44224b585c114ace4221d" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.315855 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f325690874bfb899167706dea38c4f57ef91836e19d44224b585c114ace4221d"} err="failed to get container status \"f325690874bfb899167706dea38c4f57ef91836e19d44224b585c114ace4221d\": rpc error: code = NotFound desc = could not find container \"f325690874bfb899167706dea38c4f57ef91836e19d44224b585c114ace4221d\": container with ID starting with f325690874bfb899167706dea38c4f57ef91836e19d44224b585c114ace4221d not found: ID does not exist" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.315899 4687 scope.go:117] "RemoveContainer" 
containerID="1c9ef7104fc110694f07caf4f711aeccd7d3058ee4396c11ab6e145a2805b318" Feb 28 09:20:51 crc kubenswrapper[4687]: E0228 09:20:51.318630 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c9ef7104fc110694f07caf4f711aeccd7d3058ee4396c11ab6e145a2805b318\": container with ID starting with 1c9ef7104fc110694f07caf4f711aeccd7d3058ee4396c11ab6e145a2805b318 not found: ID does not exist" containerID="1c9ef7104fc110694f07caf4f711aeccd7d3058ee4396c11ab6e145a2805b318" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.318652 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c9ef7104fc110694f07caf4f711aeccd7d3058ee4396c11ab6e145a2805b318"} err="failed to get container status \"1c9ef7104fc110694f07caf4f711aeccd7d3058ee4396c11ab6e145a2805b318\": rpc error: code = NotFound desc = could not find container \"1c9ef7104fc110694f07caf4f711aeccd7d3058ee4396c11ab6e145a2805b318\": container with ID starting with 1c9ef7104fc110694f07caf4f711aeccd7d3058ee4396c11ab6e145a2805b318 not found: ID does not exist" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.318670 4687 scope.go:117] "RemoveContainer" containerID="99d7de2b7db74ea8113bfc0922f6805ecb6418596566a8ad9d8acf61d9569ffd" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.336865 4687 scope.go:117] "RemoveContainer" containerID="626a1ffb5f3b5a18bb0918cd939d9fa5bb373a80a7f610e786ac81445a3c7d64" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.361787 4687 scope.go:117] "RemoveContainer" containerID="99d7de2b7db74ea8113bfc0922f6805ecb6418596566a8ad9d8acf61d9569ffd" Feb 28 09:20:51 crc kubenswrapper[4687]: E0228 09:20:51.362299 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99d7de2b7db74ea8113bfc0922f6805ecb6418596566a8ad9d8acf61d9569ffd\": container with ID starting with 
99d7de2b7db74ea8113bfc0922f6805ecb6418596566a8ad9d8acf61d9569ffd not found: ID does not exist" containerID="99d7de2b7db74ea8113bfc0922f6805ecb6418596566a8ad9d8acf61d9569ffd" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.362335 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99d7de2b7db74ea8113bfc0922f6805ecb6418596566a8ad9d8acf61d9569ffd"} err="failed to get container status \"99d7de2b7db74ea8113bfc0922f6805ecb6418596566a8ad9d8acf61d9569ffd\": rpc error: code = NotFound desc = could not find container \"99d7de2b7db74ea8113bfc0922f6805ecb6418596566a8ad9d8acf61d9569ffd\": container with ID starting with 99d7de2b7db74ea8113bfc0922f6805ecb6418596566a8ad9d8acf61d9569ffd not found: ID does not exist" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.362367 4687 scope.go:117] "RemoveContainer" containerID="626a1ffb5f3b5a18bb0918cd939d9fa5bb373a80a7f610e786ac81445a3c7d64" Feb 28 09:20:51 crc kubenswrapper[4687]: E0228 09:20:51.362789 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"626a1ffb5f3b5a18bb0918cd939d9fa5bb373a80a7f610e786ac81445a3c7d64\": container with ID starting with 626a1ffb5f3b5a18bb0918cd939d9fa5bb373a80a7f610e786ac81445a3c7d64 not found: ID does not exist" containerID="626a1ffb5f3b5a18bb0918cd939d9fa5bb373a80a7f610e786ac81445a3c7d64" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.362836 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"626a1ffb5f3b5a18bb0918cd939d9fa5bb373a80a7f610e786ac81445a3c7d64"} err="failed to get container status \"626a1ffb5f3b5a18bb0918cd939d9fa5bb373a80a7f610e786ac81445a3c7d64\": rpc error: code = NotFound desc = could not find container \"626a1ffb5f3b5a18bb0918cd939d9fa5bb373a80a7f610e786ac81445a3c7d64\": container with ID starting with 626a1ffb5f3b5a18bb0918cd939d9fa5bb373a80a7f610e786ac81445a3c7d64 not found: ID does not 
exist" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.362867 4687 scope.go:117] "RemoveContainer" containerID="b68cf1027bec3caa61756b0cafa9065fb6425e37e50d692bf7d2a9d913ffb111" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.380481 4687 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.380510 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/455f8be2-a725-49fb-ba76-6f3e6c4cb34d-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.497040 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-crzs5"] Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.504760 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-crzs5"] Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.520747 4687 scope.go:117] "RemoveContainer" containerID="61b4a041e894cea9908f5c65adf16323390c21a79756431a89555ce4ae9d050a" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.552132 4687 scope.go:117] "RemoveContainer" containerID="712de4921f163318aadd23457ab174bf0c4fb55adf335f7d52d76cf15375c37e" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.702556 4687 scope.go:117] "RemoveContainer" containerID="aafeb892e6f15626514b11a0c74fd9d9c18cc477eec929ba61e66e431cb01d28" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.732694 4687 scope.go:117] "RemoveContainer" containerID="712de4921f163318aadd23457ab174bf0c4fb55adf335f7d52d76cf15375c37e" Feb 28 09:20:51 crc kubenswrapper[4687]: E0228 09:20:51.733165 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"712de4921f163318aadd23457ab174bf0c4fb55adf335f7d52d76cf15375c37e\": container with ID starting with 712de4921f163318aadd23457ab174bf0c4fb55adf335f7d52d76cf15375c37e not found: ID does not exist" containerID="712de4921f163318aadd23457ab174bf0c4fb55adf335f7d52d76cf15375c37e" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.733208 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"712de4921f163318aadd23457ab174bf0c4fb55adf335f7d52d76cf15375c37e"} err="failed to get container status \"712de4921f163318aadd23457ab174bf0c4fb55adf335f7d52d76cf15375c37e\": rpc error: code = NotFound desc = could not find container \"712de4921f163318aadd23457ab174bf0c4fb55adf335f7d52d76cf15375c37e\": container with ID starting with 712de4921f163318aadd23457ab174bf0c4fb55adf335f7d52d76cf15375c37e not found: ID does not exist" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.733239 4687 scope.go:117] "RemoveContainer" containerID="aafeb892e6f15626514b11a0c74fd9d9c18cc477eec929ba61e66e431cb01d28" Feb 28 09:20:51 crc kubenswrapper[4687]: E0228 09:20:51.733681 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aafeb892e6f15626514b11a0c74fd9d9c18cc477eec929ba61e66e431cb01d28\": container with ID starting with aafeb892e6f15626514b11a0c74fd9d9c18cc477eec929ba61e66e431cb01d28 not found: ID does not exist" containerID="aafeb892e6f15626514b11a0c74fd9d9c18cc477eec929ba61e66e431cb01d28" Feb 28 09:20:51 crc kubenswrapper[4687]: I0228 09:20:51.733721 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aafeb892e6f15626514b11a0c74fd9d9c18cc477eec929ba61e66e431cb01d28"} err="failed to get container status \"aafeb892e6f15626514b11a0c74fd9d9c18cc477eec929ba61e66e431cb01d28\": rpc error: code = NotFound desc = could not find container \"aafeb892e6f15626514b11a0c74fd9d9c18cc477eec929ba61e66e431cb01d28\": container with ID 
starting with aafeb892e6f15626514b11a0c74fd9d9c18cc477eec929ba61e66e431cb01d28 not found: ID does not exist" Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.077662 4687 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5d58956cb6-f8plp" podUID="6a06887c-91c5-43bb-8631-53fac29e79b6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.203233 4687 generic.go:334] "Generic (PLEG): container finished" podID="b82a8aed-cc7b-4802-80f0-63e701ee0593" containerID="85a9619d18290d62f51f4d9dba3f1999673011b6aeca5038cb9261a82771a841" exitCode=0 Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.203309 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b82a8aed-cc7b-4802-80f0-63e701ee0593","Type":"ContainerDied","Data":"85a9619d18290d62f51f4d9dba3f1999673011b6aeca5038cb9261a82771a841"} Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.209964 4687 generic.go:334] "Generic (PLEG): container finished" podID="d655bdf4-33ab-45fa-b1e4-c37aede5609a" containerID="51b219e86f3b0d6b4919b070002226d15fce4b8fe16494e79bab096be1e39e20" exitCode=0 Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.210043 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bd77ccf75-bqx56" event={"ID":"d655bdf4-33ab-45fa-b1e4-c37aede5609a","Type":"ContainerDied","Data":"51b219e86f3b0d6b4919b070002226d15fce4b8fe16494e79bab096be1e39e20"} Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.369302 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.428423 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-combined-ca-bundle\") pod \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.428539 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-internal-tls-certs\") pod \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.428570 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-public-tls-certs\") pod \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.428708 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-config\") pod \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.428778 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5969\" (UniqueName: \"kubernetes.io/projected/d655bdf4-33ab-45fa-b1e4-c37aede5609a-kube-api-access-r5969\") pod \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.428847 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-ovndb-tls-certs\") pod \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.428970 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-httpd-config\") pod \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\" (UID: \"d655bdf4-33ab-45fa-b1e4-c37aede5609a\") " Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.432509 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d655bdf4-33ab-45fa-b1e4-c37aede5609a" (UID: "d655bdf4-33ab-45fa-b1e4-c37aede5609a"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.435178 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d655bdf4-33ab-45fa-b1e4-c37aede5609a-kube-api-access-r5969" (OuterVolumeSpecName: "kube-api-access-r5969") pod "d655bdf4-33ab-45fa-b1e4-c37aede5609a" (UID: "d655bdf4-33ab-45fa-b1e4-c37aede5609a"). InnerVolumeSpecName "kube-api-access-r5969". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.477038 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d655bdf4-33ab-45fa-b1e4-c37aede5609a" (UID: "d655bdf4-33ab-45fa-b1e4-c37aede5609a"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.488200 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-config" (OuterVolumeSpecName: "config") pod "d655bdf4-33ab-45fa-b1e4-c37aede5609a" (UID: "d655bdf4-33ab-45fa-b1e4-c37aede5609a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.489648 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d655bdf4-33ab-45fa-b1e4-c37aede5609a" (UID: "d655bdf4-33ab-45fa-b1e4-c37aede5609a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.501402 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d655bdf4-33ab-45fa-b1e4-c37aede5609a" (UID: "d655bdf4-33ab-45fa-b1e4-c37aede5609a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.510575 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "d655bdf4-33ab-45fa-b1e4-c37aede5609a" (UID: "d655bdf4-33ab-45fa-b1e4-c37aede5609a"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.531495 4687 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.531524 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.531537 4687 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.531549 4687 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.531559 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.531569 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5969\" (UniqueName: \"kubernetes.io/projected/d655bdf4-33ab-45fa-b1e4-c37aede5609a-kube-api-access-r5969\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.531589 4687 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d655bdf4-33ab-45fa-b1e4-c37aede5609a-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.668112 4687 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27799696-4eb6-4ef9-9440-151a3929d699" path="/var/lib/kubelet/pods/27799696-4eb6-4ef9-9440-151a3929d699/volumes" Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.668678 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="455f8be2-a725-49fb-ba76-6f3e6c4cb34d" path="/var/lib/kubelet/pods/455f8be2-a725-49fb-ba76-6f3e6c4cb34d/volumes" Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.669338 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76f683cb-cc38-4cdd-a0f0-1077410b1768" path="/var/lib/kubelet/pods/76f683cb-cc38-4cdd-a0f0-1077410b1768/volumes" Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.670432 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84c40408-c638-4bea-86d5-fb40a60b6975" path="/var/lib/kubelet/pods/84c40408-c638-4bea-86d5-fb40a60b6975/volumes" Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.958857 4687 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5d8d4bb8d-87zwm" podUID="aa88f1b2-477c-461c-a044-88fd35c31231" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:56738->10.217.0.166:9311: read: connection reset by peer" Feb 28 09:20:52 crc kubenswrapper[4687]: I0228 09:20:52.958938 4687 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5d8d4bb8d-87zwm" podUID="aa88f1b2-477c-461c-a044-88fd35c31231" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:56748->10.217.0.166:9311: read: connection reset by peer" Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.228712 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bd77ccf75-bqx56" 
event={"ID":"d655bdf4-33ab-45fa-b1e4-c37aede5609a","Type":"ContainerDied","Data":"c148b3a169846cd28c277934cdaf8f10f03c20cf3471050301bc5785ff1c3420"} Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.228843 4687 scope.go:117] "RemoveContainer" containerID="1f47f176744fd7232de0f9faea595a9e3333827c6923ad75f5f60d0995f4502e" Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.229128 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5bd77ccf75-bqx56" Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.237283 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"53078962-6c8c-436e-8d57-e2ed7e9e2b6e","Type":"ContainerStarted","Data":"d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d"} Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.237495 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.240440 4687 generic.go:334] "Generic (PLEG): container finished" podID="aa88f1b2-477c-461c-a044-88fd35c31231" containerID="b1e3eae08f7f2f92c12748feeba374abf768df6a9244158975b18d0b40305051" exitCode=0 Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.240481 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d8d4bb8d-87zwm" event={"ID":"aa88f1b2-477c-461c-a044-88fd35c31231","Type":"ContainerDied","Data":"b1e3eae08f7f2f92c12748feeba374abf768df6a9244158975b18d0b40305051"} Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.281010 4687 scope.go:117] "RemoveContainer" containerID="51b219e86f3b0d6b4919b070002226d15fce4b8fe16494e79bab096be1e39e20" Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.288501 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5bd77ccf75-bqx56"] Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.296120 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/neutron-5bd77ccf75-bqx56"] Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.300856 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.122961674 podStartE2EDuration="6.300837696s" podCreationTimestamp="2026-02-28 09:20:47 +0000 UTC" firstStartedPulling="2026-02-28 09:20:48.050534877 +0000 UTC m=+1039.741104214" lastFinishedPulling="2026-02-28 09:20:52.228410899 +0000 UTC m=+1043.918980236" observedRunningTime="2026-02-28 09:20:53.272249389 +0000 UTC m=+1044.962818726" watchObservedRunningTime="2026-02-28 09:20:53.300837696 +0000 UTC m=+1044.991407032" Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.394380 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5d8d4bb8d-87zwm" Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.455804 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa88f1b2-477c-461c-a044-88fd35c31231-config-data-custom\") pod \"aa88f1b2-477c-461c-a044-88fd35c31231\" (UID: \"aa88f1b2-477c-461c-a044-88fd35c31231\") " Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.455944 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxdn6\" (UniqueName: \"kubernetes.io/projected/aa88f1b2-477c-461c-a044-88fd35c31231-kube-api-access-lxdn6\") pod \"aa88f1b2-477c-461c-a044-88fd35c31231\" (UID: \"aa88f1b2-477c-461c-a044-88fd35c31231\") " Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.456136 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa88f1b2-477c-461c-a044-88fd35c31231-combined-ca-bundle\") pod \"aa88f1b2-477c-461c-a044-88fd35c31231\" (UID: \"aa88f1b2-477c-461c-a044-88fd35c31231\") " Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.456675 4687 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa88f1b2-477c-461c-a044-88fd35c31231-config-data\") pod \"aa88f1b2-477c-461c-a044-88fd35c31231\" (UID: \"aa88f1b2-477c-461c-a044-88fd35c31231\") " Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.456709 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa88f1b2-477c-461c-a044-88fd35c31231-logs\") pod \"aa88f1b2-477c-461c-a044-88fd35c31231\" (UID: \"aa88f1b2-477c-461c-a044-88fd35c31231\") " Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.458211 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa88f1b2-477c-461c-a044-88fd35c31231-logs" (OuterVolumeSpecName: "logs") pod "aa88f1b2-477c-461c-a044-88fd35c31231" (UID: "aa88f1b2-477c-461c-a044-88fd35c31231"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.463567 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa88f1b2-477c-461c-a044-88fd35c31231-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "aa88f1b2-477c-461c-a044-88fd35c31231" (UID: "aa88f1b2-477c-461c-a044-88fd35c31231"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.463770 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa88f1b2-477c-461c-a044-88fd35c31231-kube-api-access-lxdn6" (OuterVolumeSpecName: "kube-api-access-lxdn6") pod "aa88f1b2-477c-461c-a044-88fd35c31231" (UID: "aa88f1b2-477c-461c-a044-88fd35c31231"). InnerVolumeSpecName "kube-api-access-lxdn6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.479940 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa88f1b2-477c-461c-a044-88fd35c31231-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa88f1b2-477c-461c-a044-88fd35c31231" (UID: "aa88f1b2-477c-461c-a044-88fd35c31231"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.498391 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa88f1b2-477c-461c-a044-88fd35c31231-config-data" (OuterVolumeSpecName: "config-data") pod "aa88f1b2-477c-461c-a044-88fd35c31231" (UID: "aa88f1b2-477c-461c-a044-88fd35c31231"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.559416 4687 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa88f1b2-477c-461c-a044-88fd35c31231-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.559462 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxdn6\" (UniqueName: \"kubernetes.io/projected/aa88f1b2-477c-461c-a044-88fd35c31231-kube-api-access-lxdn6\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.559476 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa88f1b2-477c-461c-a044-88fd35c31231-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.559485 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa88f1b2-477c-461c-a044-88fd35c31231-config-data\") on node 
\"crc\" DevicePath \"\"" Feb 28 09:20:53 crc kubenswrapper[4687]: I0228 09:20:53.559493 4687 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa88f1b2-477c-461c-a044-88fd35c31231-logs\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:54 crc kubenswrapper[4687]: E0228 09:20:54.172070 4687 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/8b4e21b16e9980c28089d41f84b2cd834fbd011988d7d74a1d779c81efdf391c/diff" to get inode usage: stat /var/lib/containers/storage/overlay/8b4e21b16e9980c28089d41f84b2cd834fbd011988d7d74a1d779c81efdf391c/diff: no such file or directory, extraDiskErr: Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.223532 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.252426 4687 generic.go:334] "Generic (PLEG): container finished" podID="b82a8aed-cc7b-4802-80f0-63e701ee0593" containerID="76ec839f85509f3b4f49b9c4eca232d78d0752975887739b09646d96f7698d09" exitCode=0 Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.252872 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b82a8aed-cc7b-4802-80f0-63e701ee0593","Type":"ContainerDied","Data":"76ec839f85509f3b4f49b9c4eca232d78d0752975887739b09646d96f7698d09"} Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.252902 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b82a8aed-cc7b-4802-80f0-63e701ee0593","Type":"ContainerDied","Data":"2cbfd448dfbfd7ff8b1c920092f8eb02bd9ac026d2b29b479aec909800756385"} Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.253257 4687 scope.go:117] "RemoveContainer" containerID="85a9619d18290d62f51f4d9dba3f1999673011b6aeca5038cb9261a82771a841" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.253352 4687 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.266310 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5d8d4bb8d-87zwm" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.267247 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d8d4bb8d-87zwm" event={"ID":"aa88f1b2-477c-461c-a044-88fd35c31231","Type":"ContainerDied","Data":"20c3398c07b4a58ec0440d2377669a98032c546a99db3287a0bf00e017a3e7b6"} Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.271503 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b82a8aed-cc7b-4802-80f0-63e701ee0593-etc-machine-id\") pod \"b82a8aed-cc7b-4802-80f0-63e701ee0593\" (UID: \"b82a8aed-cc7b-4802-80f0-63e701ee0593\") " Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.271559 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b82a8aed-cc7b-4802-80f0-63e701ee0593-config-data-custom\") pod \"b82a8aed-cc7b-4802-80f0-63e701ee0593\" (UID: \"b82a8aed-cc7b-4802-80f0-63e701ee0593\") " Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.271618 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b82a8aed-cc7b-4802-80f0-63e701ee0593-config-data\") pod \"b82a8aed-cc7b-4802-80f0-63e701ee0593\" (UID: \"b82a8aed-cc7b-4802-80f0-63e701ee0593\") " Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.271681 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b82a8aed-cc7b-4802-80f0-63e701ee0593-combined-ca-bundle\") pod \"b82a8aed-cc7b-4802-80f0-63e701ee0593\" (UID: 
\"b82a8aed-cc7b-4802-80f0-63e701ee0593\") " Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.271699 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b82a8aed-cc7b-4802-80f0-63e701ee0593-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b82a8aed-cc7b-4802-80f0-63e701ee0593" (UID: "b82a8aed-cc7b-4802-80f0-63e701ee0593"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.272403 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b82a8aed-cc7b-4802-80f0-63e701ee0593-scripts\") pod \"b82a8aed-cc7b-4802-80f0-63e701ee0593\" (UID: \"b82a8aed-cc7b-4802-80f0-63e701ee0593\") " Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.272450 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6k8f\" (UniqueName: \"kubernetes.io/projected/b82a8aed-cc7b-4802-80f0-63e701ee0593-kube-api-access-h6k8f\") pod \"b82a8aed-cc7b-4802-80f0-63e701ee0593\" (UID: \"b82a8aed-cc7b-4802-80f0-63e701ee0593\") " Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.273090 4687 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b82a8aed-cc7b-4802-80f0-63e701ee0593-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.278732 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b82a8aed-cc7b-4802-80f0-63e701ee0593-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b82a8aed-cc7b-4802-80f0-63e701ee0593" (UID: "b82a8aed-cc7b-4802-80f0-63e701ee0593"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.289172 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b82a8aed-cc7b-4802-80f0-63e701ee0593-kube-api-access-h6k8f" (OuterVolumeSpecName: "kube-api-access-h6k8f") pod "b82a8aed-cc7b-4802-80f0-63e701ee0593" (UID: "b82a8aed-cc7b-4802-80f0-63e701ee0593"). InnerVolumeSpecName "kube-api-access-h6k8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.289292 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b82a8aed-cc7b-4802-80f0-63e701ee0593-scripts" (OuterVolumeSpecName: "scripts") pod "b82a8aed-cc7b-4802-80f0-63e701ee0593" (UID: "b82a8aed-cc7b-4802-80f0-63e701ee0593"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.291192 4687 scope.go:117] "RemoveContainer" containerID="76ec839f85509f3b4f49b9c4eca232d78d0752975887739b09646d96f7698d09" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.325225 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b82a8aed-cc7b-4802-80f0-63e701ee0593-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b82a8aed-cc7b-4802-80f0-63e701ee0593" (UID: "b82a8aed-cc7b-4802-80f0-63e701ee0593"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.338612 4687 scope.go:117] "RemoveContainer" containerID="85a9619d18290d62f51f4d9dba3f1999673011b6aeca5038cb9261a82771a841" Feb 28 09:20:54 crc kubenswrapper[4687]: E0228 09:20:54.339030 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85a9619d18290d62f51f4d9dba3f1999673011b6aeca5038cb9261a82771a841\": container with ID starting with 85a9619d18290d62f51f4d9dba3f1999673011b6aeca5038cb9261a82771a841 not found: ID does not exist" containerID="85a9619d18290d62f51f4d9dba3f1999673011b6aeca5038cb9261a82771a841" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.339085 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85a9619d18290d62f51f4d9dba3f1999673011b6aeca5038cb9261a82771a841"} err="failed to get container status \"85a9619d18290d62f51f4d9dba3f1999673011b6aeca5038cb9261a82771a841\": rpc error: code = NotFound desc = could not find container \"85a9619d18290d62f51f4d9dba3f1999673011b6aeca5038cb9261a82771a841\": container with ID starting with 85a9619d18290d62f51f4d9dba3f1999673011b6aeca5038cb9261a82771a841 not found: ID does not exist" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.339115 4687 scope.go:117] "RemoveContainer" containerID="76ec839f85509f3b4f49b9c4eca232d78d0752975887739b09646d96f7698d09" Feb 28 09:20:54 crc kubenswrapper[4687]: E0228 09:20:54.339587 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76ec839f85509f3b4f49b9c4eca232d78d0752975887739b09646d96f7698d09\": container with ID starting with 76ec839f85509f3b4f49b9c4eca232d78d0752975887739b09646d96f7698d09 not found: ID does not exist" containerID="76ec839f85509f3b4f49b9c4eca232d78d0752975887739b09646d96f7698d09" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.339617 
4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76ec839f85509f3b4f49b9c4eca232d78d0752975887739b09646d96f7698d09"} err="failed to get container status \"76ec839f85509f3b4f49b9c4eca232d78d0752975887739b09646d96f7698d09\": rpc error: code = NotFound desc = could not find container \"76ec839f85509f3b4f49b9c4eca232d78d0752975887739b09646d96f7698d09\": container with ID starting with 76ec839f85509f3b4f49b9c4eca232d78d0752975887739b09646d96f7698d09 not found: ID does not exist" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.339637 4687 scope.go:117] "RemoveContainer" containerID="b1e3eae08f7f2f92c12748feeba374abf768df6a9244158975b18d0b40305051" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.341615 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5d8d4bb8d-87zwm"] Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.348077 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5d8d4bb8d-87zwm"] Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.368146 4687 scope.go:117] "RemoveContainer" containerID="a9fe336de21c280d3de01f19fc9fddfa3d280f1561ff146cf049d7465c378b2d" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.375080 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b82a8aed-cc7b-4802-80f0-63e701ee0593-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.375104 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6k8f\" (UniqueName: \"kubernetes.io/projected/b82a8aed-cc7b-4802-80f0-63e701ee0593-kube-api-access-h6k8f\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.375116 4687 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b82a8aed-cc7b-4802-80f0-63e701ee0593-config-data-custom\") on node 
\"crc\" DevicePath \"\"" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.375124 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b82a8aed-cc7b-4802-80f0-63e701ee0593-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.381508 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b82a8aed-cc7b-4802-80f0-63e701ee0593-config-data" (OuterVolumeSpecName: "config-data") pod "b82a8aed-cc7b-4802-80f0-63e701ee0593" (UID: "b82a8aed-cc7b-4802-80f0-63e701ee0593"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.477744 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b82a8aed-cc7b-4802-80f0-63e701ee0593-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.633802 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.643141 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.681216 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa88f1b2-477c-461c-a044-88fd35c31231" path="/var/lib/kubelet/pods/aa88f1b2-477c-461c-a044-88fd35c31231/volumes" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.682140 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b82a8aed-cc7b-4802-80f0-63e701ee0593" path="/var/lib/kubelet/pods/b82a8aed-cc7b-4802-80f0-63e701ee0593/volumes" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.682888 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d655bdf4-33ab-45fa-b1e4-c37aede5609a" 
path="/var/lib/kubelet/pods/d655bdf4-33ab-45fa-b1e4-c37aede5609a/volumes" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.709718 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 28 09:20:54 crc kubenswrapper[4687]: E0228 09:20:54.710354 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="455f8be2-a725-49fb-ba76-6f3e6c4cb34d" containerName="dnsmasq-dns" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710374 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="455f8be2-a725-49fb-ba76-6f3e6c4cb34d" containerName="dnsmasq-dns" Feb 28 09:20:54 crc kubenswrapper[4687]: E0228 09:20:54.710404 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d655bdf4-33ab-45fa-b1e4-c37aede5609a" containerName="neutron-httpd" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710412 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="d655bdf4-33ab-45fa-b1e4-c37aede5609a" containerName="neutron-httpd" Feb 28 09:20:54 crc kubenswrapper[4687]: E0228 09:20:54.710439 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76f683cb-cc38-4cdd-a0f0-1077410b1768" containerName="horizon-log" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710445 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="76f683cb-cc38-4cdd-a0f0-1077410b1768" containerName="horizon-log" Feb 28 09:20:54 crc kubenswrapper[4687]: E0228 09:20:54.710457 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa88f1b2-477c-461c-a044-88fd35c31231" containerName="barbican-api-log" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710463 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa88f1b2-477c-461c-a044-88fd35c31231" containerName="barbican-api-log" Feb 28 09:20:54 crc kubenswrapper[4687]: E0228 09:20:54.710473 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d655bdf4-33ab-45fa-b1e4-c37aede5609a" containerName="neutron-api" Feb 
28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710480 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="d655bdf4-33ab-45fa-b1e4-c37aede5609a" containerName="neutron-api" Feb 28 09:20:54 crc kubenswrapper[4687]: E0228 09:20:54.710487 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b82a8aed-cc7b-4802-80f0-63e701ee0593" containerName="probe" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710493 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="b82a8aed-cc7b-4802-80f0-63e701ee0593" containerName="probe" Feb 28 09:20:54 crc kubenswrapper[4687]: E0228 09:20:54.710503 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b82a8aed-cc7b-4802-80f0-63e701ee0593" containerName="cinder-scheduler" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710510 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="b82a8aed-cc7b-4802-80f0-63e701ee0593" containerName="cinder-scheduler" Feb 28 09:20:54 crc kubenswrapper[4687]: E0228 09:20:54.710525 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84c40408-c638-4bea-86d5-fb40a60b6975" containerName="horizon" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710530 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="84c40408-c638-4bea-86d5-fb40a60b6975" containerName="horizon" Feb 28 09:20:54 crc kubenswrapper[4687]: E0228 09:20:54.710540 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84c40408-c638-4bea-86d5-fb40a60b6975" containerName="horizon-log" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710547 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="84c40408-c638-4bea-86d5-fb40a60b6975" containerName="horizon-log" Feb 28 09:20:54 crc kubenswrapper[4687]: E0228 09:20:54.710556 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="455f8be2-a725-49fb-ba76-6f3e6c4cb34d" containerName="init" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710562 4687 
state_mem.go:107] "Deleted CPUSet assignment" podUID="455f8be2-a725-49fb-ba76-6f3e6c4cb34d" containerName="init" Feb 28 09:20:54 crc kubenswrapper[4687]: E0228 09:20:54.710575 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27799696-4eb6-4ef9-9440-151a3929d699" containerName="horizon" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710581 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="27799696-4eb6-4ef9-9440-151a3929d699" containerName="horizon" Feb 28 09:20:54 crc kubenswrapper[4687]: E0228 09:20:54.710591 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27799696-4eb6-4ef9-9440-151a3929d699" containerName="horizon-log" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710597 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="27799696-4eb6-4ef9-9440-151a3929d699" containerName="horizon-log" Feb 28 09:20:54 crc kubenswrapper[4687]: E0228 09:20:54.710605 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa88f1b2-477c-461c-a044-88fd35c31231" containerName="barbican-api" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710611 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa88f1b2-477c-461c-a044-88fd35c31231" containerName="barbican-api" Feb 28 09:20:54 crc kubenswrapper[4687]: E0228 09:20:54.710621 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76f683cb-cc38-4cdd-a0f0-1077410b1768" containerName="horizon" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710629 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="76f683cb-cc38-4cdd-a0f0-1077410b1768" containerName="horizon" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710849 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa88f1b2-477c-461c-a044-88fd35c31231" containerName="barbican-api-log" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710865 4687 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="76f683cb-cc38-4cdd-a0f0-1077410b1768" containerName="horizon" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710872 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="27799696-4eb6-4ef9-9440-151a3929d699" containerName="horizon-log" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710886 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="455f8be2-a725-49fb-ba76-6f3e6c4cb34d" containerName="dnsmasq-dns" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710893 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="d655bdf4-33ab-45fa-b1e4-c37aede5609a" containerName="neutron-httpd" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710905 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="76f683cb-cc38-4cdd-a0f0-1077410b1768" containerName="horizon-log" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710916 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="27799696-4eb6-4ef9-9440-151a3929d699" containerName="horizon" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710928 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="84c40408-c638-4bea-86d5-fb40a60b6975" containerName="horizon" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710938 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="b82a8aed-cc7b-4802-80f0-63e701ee0593" containerName="probe" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710946 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="d655bdf4-33ab-45fa-b1e4-c37aede5609a" containerName="neutron-api" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710957 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="84c40408-c638-4bea-86d5-fb40a60b6975" containerName="horizon-log" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710963 4687 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="aa88f1b2-477c-461c-a044-88fd35c31231" containerName="barbican-api" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.710970 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="b82a8aed-cc7b-4802-80f0-63e701ee0593" containerName="cinder-scheduler" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.712173 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.714679 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.726263 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.792733 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e9f0b9e-618d-409d-b76f-5da56783af17-config-data\") pod \"cinder-scheduler-0\" (UID: \"0e9f0b9e-618d-409d-b76f-5da56783af17\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.792802 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82qxl\" (UniqueName: \"kubernetes.io/projected/0e9f0b9e-618d-409d-b76f-5da56783af17-kube-api-access-82qxl\") pod \"cinder-scheduler-0\" (UID: \"0e9f0b9e-618d-409d-b76f-5da56783af17\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.792886 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e9f0b9e-618d-409d-b76f-5da56783af17-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0e9f0b9e-618d-409d-b76f-5da56783af17\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:54 crc kubenswrapper[4687]: 
I0228 09:20:54.793492 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e9f0b9e-618d-409d-b76f-5da56783af17-scripts\") pod \"cinder-scheduler-0\" (UID: \"0e9f0b9e-618d-409d-b76f-5da56783af17\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.793744 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9f0b9e-618d-409d-b76f-5da56783af17-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0e9f0b9e-618d-409d-b76f-5da56783af17\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.793781 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e9f0b9e-618d-409d-b76f-5da56783af17-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0e9f0b9e-618d-409d-b76f-5da56783af17\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.896489 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9f0b9e-618d-409d-b76f-5da56783af17-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0e9f0b9e-618d-409d-b76f-5da56783af17\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.896549 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e9f0b9e-618d-409d-b76f-5da56783af17-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0e9f0b9e-618d-409d-b76f-5da56783af17\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.896686 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/0e9f0b9e-618d-409d-b76f-5da56783af17-config-data\") pod \"cinder-scheduler-0\" (UID: \"0e9f0b9e-618d-409d-b76f-5da56783af17\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.896721 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82qxl\" (UniqueName: \"kubernetes.io/projected/0e9f0b9e-618d-409d-b76f-5da56783af17-kube-api-access-82qxl\") pod \"cinder-scheduler-0\" (UID: \"0e9f0b9e-618d-409d-b76f-5da56783af17\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.896752 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e9f0b9e-618d-409d-b76f-5da56783af17-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0e9f0b9e-618d-409d-b76f-5da56783af17\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.896849 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e9f0b9e-618d-409d-b76f-5da56783af17-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0e9f0b9e-618d-409d-b76f-5da56783af17\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.896975 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e9f0b9e-618d-409d-b76f-5da56783af17-scripts\") pod \"cinder-scheduler-0\" (UID: \"0e9f0b9e-618d-409d-b76f-5da56783af17\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.901514 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9f0b9e-618d-409d-b76f-5da56783af17-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"0e9f0b9e-618d-409d-b76f-5da56783af17\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.901544 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e9f0b9e-618d-409d-b76f-5da56783af17-scripts\") pod \"cinder-scheduler-0\" (UID: \"0e9f0b9e-618d-409d-b76f-5da56783af17\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.901643 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e9f0b9e-618d-409d-b76f-5da56783af17-config-data\") pod \"cinder-scheduler-0\" (UID: \"0e9f0b9e-618d-409d-b76f-5da56783af17\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.902153 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e9f0b9e-618d-409d-b76f-5da56783af17-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0e9f0b9e-618d-409d-b76f-5da56783af17\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:54 crc kubenswrapper[4687]: I0228 09:20:54.911308 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82qxl\" (UniqueName: \"kubernetes.io/projected/0e9f0b9e-618d-409d-b76f-5da56783af17-kube-api-access-82qxl\") pod \"cinder-scheduler-0\" (UID: \"0e9f0b9e-618d-409d-b76f-5da56783af17\") " pod="openstack/cinder-scheduler-0" Feb 28 09:20:55 crc kubenswrapper[4687]: I0228 09:20:55.031807 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 28 09:20:55 crc kubenswrapper[4687]: I0228 09:20:55.511946 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 28 09:20:56 crc kubenswrapper[4687]: I0228 09:20:56.283452 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0e9f0b9e-618d-409d-b76f-5da56783af17","Type":"ContainerStarted","Data":"306f4ee39f522b91fe62d6071a9eae0366ea08609bb306c6ed122a1bdc2e0470"} Feb 28 09:20:56 crc kubenswrapper[4687]: I0228 09:20:56.283852 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0e9f0b9e-618d-409d-b76f-5da56783af17","Type":"ContainerStarted","Data":"97040225eacb3042da474423bf2bb582ca88486c30aef756cabd9dc604f8fd6f"} Feb 28 09:20:56 crc kubenswrapper[4687]: E0228 09:20:56.292832 4687 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/6944d6f387dca507e78de4da1ab3a636507661250681d35cc86dfd7cdadbac1b/diff" to get inode usage: stat /var/lib/containers/storage/overlay/6944d6f387dca507e78de4da1ab3a636507661250681d35cc86dfd7cdadbac1b/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_dnsmasq-dns-ccd7c9f8f-bw8wq_baee8d66-1152-499a-9e04-1c58353c4651/dnsmasq-dns/0.log" to get inode usage: stat /var/log/pods/openstack_dnsmasq-dns-ccd7c9f8f-bw8wq_baee8d66-1152-499a-9e04-1c58353c4651/dnsmasq-dns/0.log: no such file or directory Feb 28 09:20:57 crc kubenswrapper[4687]: I0228 09:20:57.297863 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0e9f0b9e-618d-409d-b76f-5da56783af17","Type":"ContainerStarted","Data":"8089c0f6eeee47d111d143435bc0392d83b2687388f362a8b7af2619c36d556f"} Feb 28 09:20:57 crc kubenswrapper[4687]: I0228 09:20:57.316242 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-scheduler-0" podStartSLOduration=3.316224043 podStartE2EDuration="3.316224043s" podCreationTimestamp="2026-02-28 09:20:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:20:57.311415501 +0000 UTC m=+1049.001984838" watchObservedRunningTime="2026-02-28 09:20:57.316224043 +0000 UTC m=+1049.006793370" Feb 28 09:20:58 crc kubenswrapper[4687]: I0228 09:20:58.265200 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 28 09:20:59 crc kubenswrapper[4687]: I0228 09:20:59.523980 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:20:59 crc kubenswrapper[4687]: I0228 09:20:59.530812 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:21:00 crc kubenswrapper[4687]: I0228 09:21:00.032425 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 28 09:21:00 crc kubenswrapper[4687]: I0228 09:21:00.201931 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-8685d6f5dd-ndtlf" Feb 28 09:21:00 crc kubenswrapper[4687]: I0228 09:21:00.204224 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 28 09:21:02 crc kubenswrapper[4687]: I0228 09:21:02.077084 4687 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5d58956cb6-f8plp" podUID="6a06887c-91c5-43bb-8631-53fac29e79b6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.326306 4687 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/swift-proxy-fdfb795c-sf6nb"] Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.328160 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.332009 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.332685 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.332795 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.340794 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-fdfb795c-sf6nb"] Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.387419 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b30927-e15b-4464-b5e4-1245c90ce5f8-combined-ca-bundle\") pod \"swift-proxy-fdfb795c-sf6nb\" (UID: \"10b30927-e15b-4464-b5e4-1245c90ce5f8\") " pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.387461 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10b30927-e15b-4464-b5e4-1245c90ce5f8-public-tls-certs\") pod \"swift-proxy-fdfb795c-sf6nb\" (UID: \"10b30927-e15b-4464-b5e4-1245c90ce5f8\") " pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.387554 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10b30927-e15b-4464-b5e4-1245c90ce5f8-run-httpd\") pod 
\"swift-proxy-fdfb795c-sf6nb\" (UID: \"10b30927-e15b-4464-b5e4-1245c90ce5f8\") " pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.387591 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10b30927-e15b-4464-b5e4-1245c90ce5f8-config-data\") pod \"swift-proxy-fdfb795c-sf6nb\" (UID: \"10b30927-e15b-4464-b5e4-1245c90ce5f8\") " pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.387757 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsfkv\" (UniqueName: \"kubernetes.io/projected/10b30927-e15b-4464-b5e4-1245c90ce5f8-kube-api-access-vsfkv\") pod \"swift-proxy-fdfb795c-sf6nb\" (UID: \"10b30927-e15b-4464-b5e4-1245c90ce5f8\") " pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.387909 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10b30927-e15b-4464-b5e4-1245c90ce5f8-internal-tls-certs\") pod \"swift-proxy-fdfb795c-sf6nb\" (UID: \"10b30927-e15b-4464-b5e4-1245c90ce5f8\") " pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.387993 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10b30927-e15b-4464-b5e4-1245c90ce5f8-log-httpd\") pod \"swift-proxy-fdfb795c-sf6nb\" (UID: \"10b30927-e15b-4464-b5e4-1245c90ce5f8\") " pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.388094 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/10b30927-e15b-4464-b5e4-1245c90ce5f8-etc-swift\") pod \"swift-proxy-fdfb795c-sf6nb\" (UID: \"10b30927-e15b-4464-b5e4-1245c90ce5f8\") " pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.491227 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b30927-e15b-4464-b5e4-1245c90ce5f8-combined-ca-bundle\") pod \"swift-proxy-fdfb795c-sf6nb\" (UID: \"10b30927-e15b-4464-b5e4-1245c90ce5f8\") " pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.491278 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10b30927-e15b-4464-b5e4-1245c90ce5f8-public-tls-certs\") pod \"swift-proxy-fdfb795c-sf6nb\" (UID: \"10b30927-e15b-4464-b5e4-1245c90ce5f8\") " pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.491329 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10b30927-e15b-4464-b5e4-1245c90ce5f8-run-httpd\") pod \"swift-proxy-fdfb795c-sf6nb\" (UID: \"10b30927-e15b-4464-b5e4-1245c90ce5f8\") " pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.491621 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10b30927-e15b-4464-b5e4-1245c90ce5f8-config-data\") pod \"swift-proxy-fdfb795c-sf6nb\" (UID: \"10b30927-e15b-4464-b5e4-1245c90ce5f8\") " pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.491871 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsfkv\" (UniqueName: 
\"kubernetes.io/projected/10b30927-e15b-4464-b5e4-1245c90ce5f8-kube-api-access-vsfkv\") pod \"swift-proxy-fdfb795c-sf6nb\" (UID: \"10b30927-e15b-4464-b5e4-1245c90ce5f8\") " pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.491979 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10b30927-e15b-4464-b5e4-1245c90ce5f8-run-httpd\") pod \"swift-proxy-fdfb795c-sf6nb\" (UID: \"10b30927-e15b-4464-b5e4-1245c90ce5f8\") " pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.492095 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10b30927-e15b-4464-b5e4-1245c90ce5f8-internal-tls-certs\") pod \"swift-proxy-fdfb795c-sf6nb\" (UID: \"10b30927-e15b-4464-b5e4-1245c90ce5f8\") " pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.492206 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10b30927-e15b-4464-b5e4-1245c90ce5f8-log-httpd\") pod \"swift-proxy-fdfb795c-sf6nb\" (UID: \"10b30927-e15b-4464-b5e4-1245c90ce5f8\") " pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.492284 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/10b30927-e15b-4464-b5e4-1245c90ce5f8-etc-swift\") pod \"swift-proxy-fdfb795c-sf6nb\" (UID: \"10b30927-e15b-4464-b5e4-1245c90ce5f8\") " pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.493533 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10b30927-e15b-4464-b5e4-1245c90ce5f8-log-httpd\") pod \"swift-proxy-fdfb795c-sf6nb\" 
(UID: \"10b30927-e15b-4464-b5e4-1245c90ce5f8\") " pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.500055 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10b30927-e15b-4464-b5e4-1245c90ce5f8-public-tls-certs\") pod \"swift-proxy-fdfb795c-sf6nb\" (UID: \"10b30927-e15b-4464-b5e4-1245c90ce5f8\") " pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.500097 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10b30927-e15b-4464-b5e4-1245c90ce5f8-internal-tls-certs\") pod \"swift-proxy-fdfb795c-sf6nb\" (UID: \"10b30927-e15b-4464-b5e4-1245c90ce5f8\") " pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.500727 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b30927-e15b-4464-b5e4-1245c90ce5f8-combined-ca-bundle\") pod \"swift-proxy-fdfb795c-sf6nb\" (UID: \"10b30927-e15b-4464-b5e4-1245c90ce5f8\") " pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.501626 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/10b30927-e15b-4464-b5e4-1245c90ce5f8-etc-swift\") pod \"swift-proxy-fdfb795c-sf6nb\" (UID: \"10b30927-e15b-4464-b5e4-1245c90ce5f8\") " pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.509786 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10b30927-e15b-4464-b5e4-1245c90ce5f8-config-data\") pod \"swift-proxy-fdfb795c-sf6nb\" (UID: \"10b30927-e15b-4464-b5e4-1245c90ce5f8\") " pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:03 crc 
kubenswrapper[4687]: I0228 09:21:03.523855 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsfkv\" (UniqueName: \"kubernetes.io/projected/10b30927-e15b-4464-b5e4-1245c90ce5f8-kube-api-access-vsfkv\") pod \"swift-proxy-fdfb795c-sf6nb\" (UID: \"10b30927-e15b-4464-b5e4-1245c90ce5f8\") " pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:03 crc kubenswrapper[4687]: I0228 09:21:03.647387 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:04 crc kubenswrapper[4687]: I0228 09:21:04.154138 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-fdfb795c-sf6nb"] Feb 28 09:21:04 crc kubenswrapper[4687]: I0228 09:21:04.359081 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-fdfb795c-sf6nb" event={"ID":"10b30927-e15b-4464-b5e4-1245c90ce5f8","Type":"ContainerStarted","Data":"df2637ae82731c7bc4c0b66108825bb5cd379cf40672a4bd87d92c8998c8bb05"} Feb 28 09:21:04 crc kubenswrapper[4687]: I0228 09:21:04.359415 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-fdfb795c-sf6nb" event={"ID":"10b30927-e15b-4464-b5e4-1245c90ce5f8","Type":"ContainerStarted","Data":"49ea80648ac5f3bbdd02d9f665d0c7e7e7c4ddf3f1f753b4f5b1962c22f1abb7"} Feb 28 09:21:04 crc kubenswrapper[4687]: I0228 09:21:04.636628 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:21:04 crc kubenswrapper[4687]: I0228 09:21:04.682785 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-d6696bd5b-vf747" Feb 28 09:21:04 crc kubenswrapper[4687]: I0228 09:21:04.736310 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6f6cfc745b-qklfm"] Feb 28 09:21:04 crc kubenswrapper[4687]: I0228 09:21:04.736537 4687 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/placement-6f6cfc745b-qklfm" podUID="42eabfaf-28a5-4986-ad88-a93859225843" containerName="placement-log" containerID="cri-o://f7f614e24e6b8cbbf14ae24850ac1463ccbf43398ae08c6a403bea74d91d5729" gracePeriod=30 Feb 28 09:21:04 crc kubenswrapper[4687]: I0228 09:21:04.736695 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6f6cfc745b-qklfm" podUID="42eabfaf-28a5-4986-ad88-a93859225843" containerName="placement-api" containerID="cri-o://b0bf191f33628fb62188c40a46bede2b789b37bdee9687877e8f5cdd31171f62" gracePeriod=30 Feb 28 09:21:04 crc kubenswrapper[4687]: I0228 09:21:04.790984 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 28 09:21:04 crc kubenswrapper[4687]: I0228 09:21:04.795557 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 28 09:21:04 crc kubenswrapper[4687]: I0228 09:21:04.797681 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-zdw82" Feb 28 09:21:04 crc kubenswrapper[4687]: I0228 09:21:04.797765 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 28 09:21:04 crc kubenswrapper[4687]: I0228 09:21:04.803260 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 28 09:21:04 crc kubenswrapper[4687]: I0228 09:21:04.809992 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 28 09:21:04 crc kubenswrapper[4687]: I0228 09:21:04.921375 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39-openstack-config\") pod \"openstackclient\" (UID: \"4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39\") " pod="openstack/openstackclient" Feb 28 09:21:04 crc 
kubenswrapper[4687]: I0228 09:21:04.921426 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39-openstack-config-secret\") pod \"openstackclient\" (UID: \"4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39\") " pod="openstack/openstackclient" Feb 28 09:21:04 crc kubenswrapper[4687]: I0228 09:21:04.921738 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39\") " pod="openstack/openstackclient" Feb 28 09:21:04 crc kubenswrapper[4687]: I0228 09:21:04.922150 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt5vl\" (UniqueName: \"kubernetes.io/projected/4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39-kube-api-access-mt5vl\") pod \"openstackclient\" (UID: \"4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39\") " pod="openstack/openstackclient" Feb 28 09:21:05 crc kubenswrapper[4687]: I0228 09:21:05.024479 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39-openstack-config\") pod \"openstackclient\" (UID: \"4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39\") " pod="openstack/openstackclient" Feb 28 09:21:05 crc kubenswrapper[4687]: I0228 09:21:05.024530 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39-openstack-config-secret\") pod \"openstackclient\" (UID: \"4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39\") " pod="openstack/openstackclient" Feb 28 09:21:05 crc kubenswrapper[4687]: I0228 09:21:05.024591 4687 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39\") " pod="openstack/openstackclient" Feb 28 09:21:05 crc kubenswrapper[4687]: I0228 09:21:05.024855 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt5vl\" (UniqueName: \"kubernetes.io/projected/4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39-kube-api-access-mt5vl\") pod \"openstackclient\" (UID: \"4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39\") " pod="openstack/openstackclient" Feb 28 09:21:05 crc kubenswrapper[4687]: I0228 09:21:05.029386 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39-openstack-config-secret\") pod \"openstackclient\" (UID: \"4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39\") " pod="openstack/openstackclient" Feb 28 09:21:05 crc kubenswrapper[4687]: I0228 09:21:05.030515 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39-openstack-config\") pod \"openstackclient\" (UID: \"4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39\") " pod="openstack/openstackclient" Feb 28 09:21:05 crc kubenswrapper[4687]: I0228 09:21:05.032276 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39\") " pod="openstack/openstackclient" Feb 28 09:21:05 crc kubenswrapper[4687]: I0228 09:21:05.040898 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt5vl\" (UniqueName: 
\"kubernetes.io/projected/4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39-kube-api-access-mt5vl\") pod \"openstackclient\" (UID: \"4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39\") " pod="openstack/openstackclient" Feb 28 09:21:05 crc kubenswrapper[4687]: I0228 09:21:05.128543 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 28 09:21:05 crc kubenswrapper[4687]: I0228 09:21:05.369787 4687 generic.go:334] "Generic (PLEG): container finished" podID="42eabfaf-28a5-4986-ad88-a93859225843" containerID="f7f614e24e6b8cbbf14ae24850ac1463ccbf43398ae08c6a403bea74d91d5729" exitCode=143 Feb 28 09:21:05 crc kubenswrapper[4687]: I0228 09:21:05.369841 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f6cfc745b-qklfm" event={"ID":"42eabfaf-28a5-4986-ad88-a93859225843","Type":"ContainerDied","Data":"f7f614e24e6b8cbbf14ae24850ac1463ccbf43398ae08c6a403bea74d91d5729"} Feb 28 09:21:05 crc kubenswrapper[4687]: I0228 09:21:05.372121 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-fdfb795c-sf6nb" event={"ID":"10b30927-e15b-4464-b5e4-1245c90ce5f8","Type":"ContainerStarted","Data":"999142d1d32d0f78ba52521226c943533b0e92dd95db2f0e7fd5328715c31e4f"} Feb 28 09:21:05 crc kubenswrapper[4687]: I0228 09:21:05.399227 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-fdfb795c-sf6nb" podStartSLOduration=2.399208876 podStartE2EDuration="2.399208876s" podCreationTimestamp="2026-02-28 09:21:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:21:05.387359595 +0000 UTC m=+1057.077928942" watchObservedRunningTime="2026-02-28 09:21:05.399208876 +0000 UTC m=+1057.089778214" Feb 28 09:21:05 crc kubenswrapper[4687]: I0228 09:21:05.578684 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 28 09:21:05 crc 
kubenswrapper[4687]: I0228 09:21:05.625150 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:21:05 crc kubenswrapper[4687]: I0228 09:21:05.625474 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="53078962-6c8c-436e-8d57-e2ed7e9e2b6e" containerName="ceilometer-central-agent" containerID="cri-o://f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795" gracePeriod=30 Feb 28 09:21:05 crc kubenswrapper[4687]: I0228 09:21:05.625625 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="53078962-6c8c-436e-8d57-e2ed7e9e2b6e" containerName="proxy-httpd" containerID="cri-o://d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d" gracePeriod=30 Feb 28 09:21:05 crc kubenswrapper[4687]: I0228 09:21:05.625681 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="53078962-6c8c-436e-8d57-e2ed7e9e2b6e" containerName="sg-core" containerID="cri-o://10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2" gracePeriod=30 Feb 28 09:21:05 crc kubenswrapper[4687]: I0228 09:21:05.625716 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="53078962-6c8c-436e-8d57-e2ed7e9e2b6e" containerName="ceilometer-notification-agent" containerID="cri-o://351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743" gracePeriod=30 Feb 28 09:21:05 crc kubenswrapper[4687]: I0228 09:21:05.637478 4687 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="53078962-6c8c-436e-8d57-e2ed7e9e2b6e" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.379188 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.393102 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39","Type":"ContainerStarted","Data":"24c2e4d927c0397aae7b552e6e539e439bc4d024f3ec431aaf70990db9eea169"} Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.397961 4687 generic.go:334] "Generic (PLEG): container finished" podID="53078962-6c8c-436e-8d57-e2ed7e9e2b6e" containerID="d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d" exitCode=0 Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.398007 4687 generic.go:334] "Generic (PLEG): container finished" podID="53078962-6c8c-436e-8d57-e2ed7e9e2b6e" containerID="10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2" exitCode=2 Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.398037 4687 generic.go:334] "Generic (PLEG): container finished" podID="53078962-6c8c-436e-8d57-e2ed7e9e2b6e" containerID="351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743" exitCode=0 Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.398046 4687 generic.go:334] "Generic (PLEG): container finished" podID="53078962-6c8c-436e-8d57-e2ed7e9e2b6e" containerID="f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795" exitCode=0 Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.398250 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"53078962-6c8c-436e-8d57-e2ed7e9e2b6e","Type":"ContainerDied","Data":"d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d"} Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.398297 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"53078962-6c8c-436e-8d57-e2ed7e9e2b6e","Type":"ContainerDied","Data":"10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2"} Feb 28 09:21:06 crc 
kubenswrapper[4687]: I0228 09:21:06.398309 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"53078962-6c8c-436e-8d57-e2ed7e9e2b6e","Type":"ContainerDied","Data":"351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743"} Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.398318 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"53078962-6c8c-436e-8d57-e2ed7e9e2b6e","Type":"ContainerDied","Data":"f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795"} Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.398327 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"53078962-6c8c-436e-8d57-e2ed7e9e2b6e","Type":"ContainerDied","Data":"6f04d7668d3a0f6c9e5c4526624926cdd5fefdd5b3f4afa0334344c5ce1f2d8e"} Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.398347 4687 scope.go:117] "RemoveContainer" containerID="d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.398477 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.399437 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.399468 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.435290 4687 scope.go:117] "RemoveContainer" containerID="10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.473795 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm7vl\" (UniqueName: \"kubernetes.io/projected/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-kube-api-access-hm7vl\") pod \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.473828 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-sg-core-conf-yaml\") pod \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.473859 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-log-httpd\") pod \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.473892 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-scripts\") pod \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.474857 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-combined-ca-bundle\") pod \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.474913 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-config-data\") pod \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.474955 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-run-httpd\") pod \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\" (UID: \"53078962-6c8c-436e-8d57-e2ed7e9e2b6e\") " Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.476149 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "53078962-6c8c-436e-8d57-e2ed7e9e2b6e" (UID: "53078962-6c8c-436e-8d57-e2ed7e9e2b6e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.476456 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "53078962-6c8c-436e-8d57-e2ed7e9e2b6e" (UID: "53078962-6c8c-436e-8d57-e2ed7e9e2b6e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.481720 4687 scope.go:117] "RemoveContainer" containerID="351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.482513 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-kube-api-access-hm7vl" (OuterVolumeSpecName: "kube-api-access-hm7vl") pod "53078962-6c8c-436e-8d57-e2ed7e9e2b6e" (UID: "53078962-6c8c-436e-8d57-e2ed7e9e2b6e"). InnerVolumeSpecName "kube-api-access-hm7vl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.483161 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-scripts" (OuterVolumeSpecName: "scripts") pod "53078962-6c8c-436e-8d57-e2ed7e9e2b6e" (UID: "53078962-6c8c-436e-8d57-e2ed7e9e2b6e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.497747 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "53078962-6c8c-436e-8d57-e2ed7e9e2b6e" (UID: "53078962-6c8c-436e-8d57-e2ed7e9e2b6e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.499718 4687 scope.go:117] "RemoveContainer" containerID="f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.543286 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53078962-6c8c-436e-8d57-e2ed7e9e2b6e" (UID: "53078962-6c8c-436e-8d57-e2ed7e9e2b6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.555425 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-config-data" (OuterVolumeSpecName: "config-data") pod "53078962-6c8c-436e-8d57-e2ed7e9e2b6e" (UID: "53078962-6c8c-436e-8d57-e2ed7e9e2b6e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.577112 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.577137 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.577149 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.577158 4687 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.577167 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm7vl\" (UniqueName: \"kubernetes.io/projected/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-kube-api-access-hm7vl\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.577177 4687 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.577185 4687 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53078962-6c8c-436e-8d57-e2ed7e9e2b6e-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.609100 4687 scope.go:117] 
"RemoveContainer" containerID="d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d" Feb 28 09:21:06 crc kubenswrapper[4687]: E0228 09:21:06.609499 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d\": container with ID starting with d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d not found: ID does not exist" containerID="d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.609532 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d"} err="failed to get container status \"d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d\": rpc error: code = NotFound desc = could not find container \"d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d\": container with ID starting with d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d not found: ID does not exist" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.609554 4687 scope.go:117] "RemoveContainer" containerID="10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2" Feb 28 09:21:06 crc kubenswrapper[4687]: E0228 09:21:06.609869 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2\": container with ID starting with 10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2 not found: ID does not exist" containerID="10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.609893 4687 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2"} err="failed to get container status \"10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2\": rpc error: code = NotFound desc = could not find container \"10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2\": container with ID starting with 10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2 not found: ID does not exist" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.609906 4687 scope.go:117] "RemoveContainer" containerID="351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743" Feb 28 09:21:06 crc kubenswrapper[4687]: E0228 09:21:06.610217 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743\": container with ID starting with 351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743 not found: ID does not exist" containerID="351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.610257 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743"} err="failed to get container status \"351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743\": rpc error: code = NotFound desc = could not find container \"351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743\": container with ID starting with 351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743 not found: ID does not exist" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.610283 4687 scope.go:117] "RemoveContainer" containerID="f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795" Feb 28 09:21:06 crc kubenswrapper[4687]: E0228 09:21:06.610547 4687 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795\": container with ID starting with f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795 not found: ID does not exist" containerID="f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.610571 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795"} err="failed to get container status \"f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795\": rpc error: code = NotFound desc = could not find container \"f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795\": container with ID starting with f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795 not found: ID does not exist" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.610592 4687 scope.go:117] "RemoveContainer" containerID="d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.610828 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d"} err="failed to get container status \"d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d\": rpc error: code = NotFound desc = could not find container \"d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d\": container with ID starting with d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d not found: ID does not exist" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.610847 4687 scope.go:117] "RemoveContainer" containerID="10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.611001 4687 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2"} err="failed to get container status \"10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2\": rpc error: code = NotFound desc = could not find container \"10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2\": container with ID starting with 10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2 not found: ID does not exist" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.611033 4687 scope.go:117] "RemoveContainer" containerID="351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.611627 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743"} err="failed to get container status \"351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743\": rpc error: code = NotFound desc = could not find container \"351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743\": container with ID starting with 351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743 not found: ID does not exist" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.611646 4687 scope.go:117] "RemoveContainer" containerID="f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.611939 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795"} err="failed to get container status \"f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795\": rpc error: code = NotFound desc = could not find container \"f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795\": container with ID starting with 
f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795 not found: ID does not exist" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.611960 4687 scope.go:117] "RemoveContainer" containerID="d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.612256 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d"} err="failed to get container status \"d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d\": rpc error: code = NotFound desc = could not find container \"d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d\": container with ID starting with d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d not found: ID does not exist" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.612276 4687 scope.go:117] "RemoveContainer" containerID="10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.612520 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2"} err="failed to get container status \"10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2\": rpc error: code = NotFound desc = could not find container \"10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2\": container with ID starting with 10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2 not found: ID does not exist" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.612539 4687 scope.go:117] "RemoveContainer" containerID="351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.612794 4687 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743"} err="failed to get container status \"351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743\": rpc error: code = NotFound desc = could not find container \"351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743\": container with ID starting with 351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743 not found: ID does not exist" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.612813 4687 scope.go:117] "RemoveContainer" containerID="f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.613115 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795"} err="failed to get container status \"f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795\": rpc error: code = NotFound desc = could not find container \"f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795\": container with ID starting with f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795 not found: ID does not exist" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.613134 4687 scope.go:117] "RemoveContainer" containerID="d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.613451 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d"} err="failed to get container status \"d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d\": rpc error: code = NotFound desc = could not find container \"d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d\": container with ID starting with d2e929caa621ef16666a19be158c1205efbd50d6500ffc8b66ab52762d90da4d not found: ID does not 
exist" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.613469 4687 scope.go:117] "RemoveContainer" containerID="10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.613651 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2"} err="failed to get container status \"10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2\": rpc error: code = NotFound desc = could not find container \"10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2\": container with ID starting with 10225e99dcc5dc48a50f28b67ed459d7e34a0b6e8022a2df2bc70136ca4096d2 not found: ID does not exist" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.613670 4687 scope.go:117] "RemoveContainer" containerID="351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.613898 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743"} err="failed to get container status \"351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743\": rpc error: code = NotFound desc = could not find container \"351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743\": container with ID starting with 351a890dbd5534d0e5bb94fbd798b9c8795baf5a10a3fa0a4ae64dfe985d4743 not found: ID does not exist" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.613916 4687 scope.go:117] "RemoveContainer" containerID="f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.614237 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795"} err="failed to get container status 
\"f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795\": rpc error: code = NotFound desc = could not find container \"f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795\": container with ID starting with f3573c79ddfe507a33e16efb159ea6cd75815da3cd6e2d48421cf838a6756795 not found: ID does not exist" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.730330 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.746624 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.764092 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:21:06 crc kubenswrapper[4687]: E0228 09:21:06.764603 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53078962-6c8c-436e-8d57-e2ed7e9e2b6e" containerName="ceilometer-notification-agent" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.764627 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="53078962-6c8c-436e-8d57-e2ed7e9e2b6e" containerName="ceilometer-notification-agent" Feb 28 09:21:06 crc kubenswrapper[4687]: E0228 09:21:06.764659 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53078962-6c8c-436e-8d57-e2ed7e9e2b6e" containerName="sg-core" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.764667 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="53078962-6c8c-436e-8d57-e2ed7e9e2b6e" containerName="sg-core" Feb 28 09:21:06 crc kubenswrapper[4687]: E0228 09:21:06.764676 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53078962-6c8c-436e-8d57-e2ed7e9e2b6e" containerName="proxy-httpd" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.764684 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="53078962-6c8c-436e-8d57-e2ed7e9e2b6e" containerName="proxy-httpd" Feb 28 09:21:06 crc 
kubenswrapper[4687]: E0228 09:21:06.764698 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53078962-6c8c-436e-8d57-e2ed7e9e2b6e" containerName="ceilometer-central-agent" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.764705 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="53078962-6c8c-436e-8d57-e2ed7e9e2b6e" containerName="ceilometer-central-agent" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.764908 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="53078962-6c8c-436e-8d57-e2ed7e9e2b6e" containerName="proxy-httpd" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.764930 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="53078962-6c8c-436e-8d57-e2ed7e9e2b6e" containerName="sg-core" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.764945 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="53078962-6c8c-436e-8d57-e2ed7e9e2b6e" containerName="ceilometer-central-agent" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.764956 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="53078962-6c8c-436e-8d57-e2ed7e9e2b6e" containerName="ceilometer-notification-agent" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.768168 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.770203 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.770404 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.773492 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.883591 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55473b1d-46e6-4ee3-953b-36013758c6e8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " pod="openstack/ceilometer-0" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.883661 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlkkh\" (UniqueName: \"kubernetes.io/projected/55473b1d-46e6-4ee3-953b-36013758c6e8-kube-api-access-zlkkh\") pod \"ceilometer-0\" (UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " pod="openstack/ceilometer-0" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.883727 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55473b1d-46e6-4ee3-953b-36013758c6e8-log-httpd\") pod \"ceilometer-0\" (UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " pod="openstack/ceilometer-0" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.883991 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55473b1d-46e6-4ee3-953b-36013758c6e8-run-httpd\") pod \"ceilometer-0\" (UID: 
\"55473b1d-46e6-4ee3-953b-36013758c6e8\") " pod="openstack/ceilometer-0" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.884274 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55473b1d-46e6-4ee3-953b-36013758c6e8-scripts\") pod \"ceilometer-0\" (UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " pod="openstack/ceilometer-0" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.884480 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55473b1d-46e6-4ee3-953b-36013758c6e8-config-data\") pod \"ceilometer-0\" (UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " pod="openstack/ceilometer-0" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.884692 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55473b1d-46e6-4ee3-953b-36013758c6e8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " pod="openstack/ceilometer-0" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.987121 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55473b1d-46e6-4ee3-953b-36013758c6e8-run-httpd\") pod \"ceilometer-0\" (UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " pod="openstack/ceilometer-0" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.987183 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55473b1d-46e6-4ee3-953b-36013758c6e8-scripts\") pod \"ceilometer-0\" (UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " pod="openstack/ceilometer-0" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.987208 4687 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55473b1d-46e6-4ee3-953b-36013758c6e8-config-data\") pod \"ceilometer-0\" (UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " pod="openstack/ceilometer-0" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.987246 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55473b1d-46e6-4ee3-953b-36013758c6e8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " pod="openstack/ceilometer-0" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.987278 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55473b1d-46e6-4ee3-953b-36013758c6e8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " pod="openstack/ceilometer-0" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.987297 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlkkh\" (UniqueName: \"kubernetes.io/projected/55473b1d-46e6-4ee3-953b-36013758c6e8-kube-api-access-zlkkh\") pod \"ceilometer-0\" (UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " pod="openstack/ceilometer-0" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.987321 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55473b1d-46e6-4ee3-953b-36013758c6e8-log-httpd\") pod \"ceilometer-0\" (UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " pod="openstack/ceilometer-0" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.987751 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55473b1d-46e6-4ee3-953b-36013758c6e8-log-httpd\") pod \"ceilometer-0\" (UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " pod="openstack/ceilometer-0" Feb 
28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.987952 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55473b1d-46e6-4ee3-953b-36013758c6e8-run-httpd\") pod \"ceilometer-0\" (UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " pod="openstack/ceilometer-0" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.992481 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55473b1d-46e6-4ee3-953b-36013758c6e8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " pod="openstack/ceilometer-0" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.992839 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55473b1d-46e6-4ee3-953b-36013758c6e8-config-data\") pod \"ceilometer-0\" (UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " pod="openstack/ceilometer-0" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.992843 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55473b1d-46e6-4ee3-953b-36013758c6e8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " pod="openstack/ceilometer-0" Feb 28 09:21:06 crc kubenswrapper[4687]: I0228 09:21:06.996644 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55473b1d-46e6-4ee3-953b-36013758c6e8-scripts\") pod \"ceilometer-0\" (UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " pod="openstack/ceilometer-0" Feb 28 09:21:07 crc kubenswrapper[4687]: I0228 09:21:07.008717 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlkkh\" (UniqueName: \"kubernetes.io/projected/55473b1d-46e6-4ee3-953b-36013758c6e8-kube-api-access-zlkkh\") pod \"ceilometer-0\" 
(UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " pod="openstack/ceilometer-0" Feb 28 09:21:07 crc kubenswrapper[4687]: I0228 09:21:07.089762 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:21:07 crc kubenswrapper[4687]: I0228 09:21:07.517160 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.254809 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.317815 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-scripts\") pod \"42eabfaf-28a5-4986-ad88-a93859225843\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.318102 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42eabfaf-28a5-4986-ad88-a93859225843-logs\") pod \"42eabfaf-28a5-4986-ad88-a93859225843\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.318165 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m42qm\" (UniqueName: \"kubernetes.io/projected/42eabfaf-28a5-4986-ad88-a93859225843-kube-api-access-m42qm\") pod \"42eabfaf-28a5-4986-ad88-a93859225843\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.318188 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-combined-ca-bundle\") pod \"42eabfaf-28a5-4986-ad88-a93859225843\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " Feb 28 09:21:08 
crc kubenswrapper[4687]: I0228 09:21:08.318222 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-public-tls-certs\") pod \"42eabfaf-28a5-4986-ad88-a93859225843\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.318287 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-config-data\") pod \"42eabfaf-28a5-4986-ad88-a93859225843\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.318453 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42eabfaf-28a5-4986-ad88-a93859225843-logs" (OuterVolumeSpecName: "logs") pod "42eabfaf-28a5-4986-ad88-a93859225843" (UID: "42eabfaf-28a5-4986-ad88-a93859225843"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.318574 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-internal-tls-certs\") pod \"42eabfaf-28a5-4986-ad88-a93859225843\" (UID: \"42eabfaf-28a5-4986-ad88-a93859225843\") " Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.318997 4687 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42eabfaf-28a5-4986-ad88-a93859225843-logs\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.322406 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42eabfaf-28a5-4986-ad88-a93859225843-kube-api-access-m42qm" (OuterVolumeSpecName: "kube-api-access-m42qm") pod "42eabfaf-28a5-4986-ad88-a93859225843" (UID: "42eabfaf-28a5-4986-ad88-a93859225843"). InnerVolumeSpecName "kube-api-access-m42qm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.327919 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-scripts" (OuterVolumeSpecName: "scripts") pod "42eabfaf-28a5-4986-ad88-a93859225843" (UID: "42eabfaf-28a5-4986-ad88-a93859225843"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.366694 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42eabfaf-28a5-4986-ad88-a93859225843" (UID: "42eabfaf-28a5-4986-ad88-a93859225843"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.367488 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-config-data" (OuterVolumeSpecName: "config-data") pod "42eabfaf-28a5-4986-ad88-a93859225843" (UID: "42eabfaf-28a5-4986-ad88-a93859225843"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.396381 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "42eabfaf-28a5-4986-ad88-a93859225843" (UID: "42eabfaf-28a5-4986-ad88-a93859225843"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.411743 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "42eabfaf-28a5-4986-ad88-a93859225843" (UID: "42eabfaf-28a5-4986-ad88-a93859225843"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.417288 4687 generic.go:334] "Generic (PLEG): container finished" podID="42eabfaf-28a5-4986-ad88-a93859225843" containerID="b0bf191f33628fb62188c40a46bede2b789b37bdee9687877e8f5cdd31171f62" exitCode=0 Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.417350 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6f6cfc745b-qklfm" Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.417369 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f6cfc745b-qklfm" event={"ID":"42eabfaf-28a5-4986-ad88-a93859225843","Type":"ContainerDied","Data":"b0bf191f33628fb62188c40a46bede2b789b37bdee9687877e8f5cdd31171f62"} Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.417410 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f6cfc745b-qklfm" event={"ID":"42eabfaf-28a5-4986-ad88-a93859225843","Type":"ContainerDied","Data":"bb33055feadf667676e8af073e956194ad6a1d84e4fbdb30ed526dd9a37339b7"} Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.417431 4687 scope.go:117] "RemoveContainer" containerID="b0bf191f33628fb62188c40a46bede2b789b37bdee9687877e8f5cdd31171f62" Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.421065 4687 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.421095 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.421108 4687 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.421119 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.421129 4687 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m42qm\" (UniqueName: \"kubernetes.io/projected/42eabfaf-28a5-4986-ad88-a93859225843-kube-api-access-m42qm\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.421139 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42eabfaf-28a5-4986-ad88-a93859225843-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.421184 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55473b1d-46e6-4ee3-953b-36013758c6e8","Type":"ContainerStarted","Data":"4d6f76835d7dff86c5ab16f423bf593a4f1d175af0596304247ae50231fa48a5"} Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.421220 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55473b1d-46e6-4ee3-953b-36013758c6e8","Type":"ContainerStarted","Data":"1c895ced0e7308a7f04bf915a8cb013d135a2dbc10eec3bfdfdc15245c0df893"} Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.447475 4687 scope.go:117] "RemoveContainer" containerID="f7f614e24e6b8cbbf14ae24850ac1463ccbf43398ae08c6a403bea74d91d5729" Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.447883 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6f6cfc745b-qklfm"] Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.454316 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6f6cfc745b-qklfm"] Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.467958 4687 scope.go:117] "RemoveContainer" containerID="b0bf191f33628fb62188c40a46bede2b789b37bdee9687877e8f5cdd31171f62" Feb 28 09:21:08 crc kubenswrapper[4687]: E0228 09:21:08.468945 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b0bf191f33628fb62188c40a46bede2b789b37bdee9687877e8f5cdd31171f62\": container with ID starting with b0bf191f33628fb62188c40a46bede2b789b37bdee9687877e8f5cdd31171f62 not found: ID does not exist" containerID="b0bf191f33628fb62188c40a46bede2b789b37bdee9687877e8f5cdd31171f62" Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.471398 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0bf191f33628fb62188c40a46bede2b789b37bdee9687877e8f5cdd31171f62"} err="failed to get container status \"b0bf191f33628fb62188c40a46bede2b789b37bdee9687877e8f5cdd31171f62\": rpc error: code = NotFound desc = could not find container \"b0bf191f33628fb62188c40a46bede2b789b37bdee9687877e8f5cdd31171f62\": container with ID starting with b0bf191f33628fb62188c40a46bede2b789b37bdee9687877e8f5cdd31171f62 not found: ID does not exist" Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.471443 4687 scope.go:117] "RemoveContainer" containerID="f7f614e24e6b8cbbf14ae24850ac1463ccbf43398ae08c6a403bea74d91d5729" Feb 28 09:21:08 crc kubenswrapper[4687]: E0228 09:21:08.471752 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7f614e24e6b8cbbf14ae24850ac1463ccbf43398ae08c6a403bea74d91d5729\": container with ID starting with f7f614e24e6b8cbbf14ae24850ac1463ccbf43398ae08c6a403bea74d91d5729 not found: ID does not exist" containerID="f7f614e24e6b8cbbf14ae24850ac1463ccbf43398ae08c6a403bea74d91d5729" Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.471779 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7f614e24e6b8cbbf14ae24850ac1463ccbf43398ae08c6a403bea74d91d5729"} err="failed to get container status \"f7f614e24e6b8cbbf14ae24850ac1463ccbf43398ae08c6a403bea74d91d5729\": rpc error: code = NotFound desc = could not find container \"f7f614e24e6b8cbbf14ae24850ac1463ccbf43398ae08c6a403bea74d91d5729\": container with ID 
starting with f7f614e24e6b8cbbf14ae24850ac1463ccbf43398ae08c6a403bea74d91d5729 not found: ID does not exist" Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.671419 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42eabfaf-28a5-4986-ad88-a93859225843" path="/var/lib/kubelet/pods/42eabfaf-28a5-4986-ad88-a93859225843/volumes" Feb 28 09:21:08 crc kubenswrapper[4687]: I0228 09:21:08.672127 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53078962-6c8c-436e-8d57-e2ed7e9e2b6e" path="/var/lib/kubelet/pods/53078962-6c8c-436e-8d57-e2ed7e9e2b6e/volumes" Feb 28 09:21:09 crc kubenswrapper[4687]: I0228 09:21:09.430933 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55473b1d-46e6-4ee3-953b-36013758c6e8","Type":"ContainerStarted","Data":"3c5a37cb15025e2ea9da64019e231339514fd05d502893554d2668b276e265be"} Feb 28 09:21:10 crc kubenswrapper[4687]: I0228 09:21:10.444280 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55473b1d-46e6-4ee3-953b-36013758c6e8","Type":"ContainerStarted","Data":"7328c614e681d041cb5298aaf3c060ab486ec99e7e55c7aea3787f82a35a56a5"} Feb 28 09:21:11 crc kubenswrapper[4687]: I0228 09:21:11.362726 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6bd86ccc79-8jlb2" Feb 28 09:21:11 crc kubenswrapper[4687]: I0228 09:21:11.420664 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7b9c5b669b-xd8lz"] Feb 28 09:21:11 crc kubenswrapper[4687]: I0228 09:21:11.421094 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7b9c5b669b-xd8lz" podUID="0859ec96-842c-472a-be1b-f29c8f1df2d9" containerName="neutron-api" containerID="cri-o://03244fa16f2b84c19f33830379f405964557d73d6134657ac158003cd9026866" gracePeriod=30 Feb 28 09:21:11 crc kubenswrapper[4687]: I0228 09:21:11.421449 4687 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/neutron-7b9c5b669b-xd8lz" podUID="0859ec96-842c-472a-be1b-f29c8f1df2d9" containerName="neutron-httpd" containerID="cri-o://8c0f0bab64ff709f237761dab2e575643a7140ce428e1242ca10ffd15bd720ce" gracePeriod=30 Feb 28 09:21:11 crc kubenswrapper[4687]: I0228 09:21:11.471714 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55473b1d-46e6-4ee3-953b-36013758c6e8","Type":"ContainerStarted","Data":"4656e89b07a7ea7b271db19827aae6edf0c11a8f0f6bf7fc313827ad223c9711"} Feb 28 09:21:11 crc kubenswrapper[4687]: I0228 09:21:11.471866 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 28 09:21:11 crc kubenswrapper[4687]: I0228 09:21:11.496077 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8619316399999999 podStartE2EDuration="5.49605434s" podCreationTimestamp="2026-02-28 09:21:06 +0000 UTC" firstStartedPulling="2026-02-28 09:21:07.538756783 +0000 UTC m=+1059.229326120" lastFinishedPulling="2026-02-28 09:21:11.172879483 +0000 UTC m=+1062.863448820" observedRunningTime="2026-02-28 09:21:11.492120874 +0000 UTC m=+1063.182690211" watchObservedRunningTime="2026-02-28 09:21:11.49605434 +0000 UTC m=+1063.186623678" Feb 28 09:21:12 crc kubenswrapper[4687]: I0228 09:21:12.077554 4687 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5d58956cb6-f8plp" podUID="6a06887c-91c5-43bb-8631-53fac29e79b6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Feb 28 09:21:12 crc kubenswrapper[4687]: I0228 09:21:12.487011 4687 generic.go:334] "Generic (PLEG): container finished" podID="0859ec96-842c-472a-be1b-f29c8f1df2d9" containerID="8c0f0bab64ff709f237761dab2e575643a7140ce428e1242ca10ffd15bd720ce" exitCode=0 Feb 28 09:21:12 
crc kubenswrapper[4687]: I0228 09:21:12.487096 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b9c5b669b-xd8lz" event={"ID":"0859ec96-842c-472a-be1b-f29c8f1df2d9","Type":"ContainerDied","Data":"8c0f0bab64ff709f237761dab2e575643a7140ce428e1242ca10ffd15bd720ce"} Feb 28 09:21:13 crc kubenswrapper[4687]: I0228 09:21:13.656788 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:13 crc kubenswrapper[4687]: I0228 09:21:13.657050 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-fdfb795c-sf6nb" Feb 28 09:21:13 crc kubenswrapper[4687]: I0228 09:21:13.815444 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-6qwj6"] Feb 28 09:21:13 crc kubenswrapper[4687]: E0228 09:21:13.815905 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42eabfaf-28a5-4986-ad88-a93859225843" containerName="placement-log" Feb 28 09:21:13 crc kubenswrapper[4687]: I0228 09:21:13.815980 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="42eabfaf-28a5-4986-ad88-a93859225843" containerName="placement-log" Feb 28 09:21:13 crc kubenswrapper[4687]: E0228 09:21:13.816117 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42eabfaf-28a5-4986-ad88-a93859225843" containerName="placement-api" Feb 28 09:21:13 crc kubenswrapper[4687]: I0228 09:21:13.816200 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="42eabfaf-28a5-4986-ad88-a93859225843" containerName="placement-api" Feb 28 09:21:13 crc kubenswrapper[4687]: I0228 09:21:13.816416 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="42eabfaf-28a5-4986-ad88-a93859225843" containerName="placement-log" Feb 28 09:21:13 crc kubenswrapper[4687]: I0228 09:21:13.816494 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="42eabfaf-28a5-4986-ad88-a93859225843" containerName="placement-api" 
Feb 28 09:21:13 crc kubenswrapper[4687]: I0228 09:21:13.817137 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6qwj6" Feb 28 09:21:13 crc kubenswrapper[4687]: I0228 09:21:13.836796 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-6qwj6"] Feb 28 09:21:13 crc kubenswrapper[4687]: I0228 09:21:13.946617 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb848\" (UniqueName: \"kubernetes.io/projected/9159f256-61e8-41bc-bceb-d602b568ef60-kube-api-access-pb848\") pod \"nova-api-db-create-6qwj6\" (UID: \"9159f256-61e8-41bc-bceb-d602b568ef60\") " pod="openstack/nova-api-db-create-6qwj6" Feb 28 09:21:13 crc kubenswrapper[4687]: I0228 09:21:13.946712 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9159f256-61e8-41bc-bceb-d602b568ef60-operator-scripts\") pod \"nova-api-db-create-6qwj6\" (UID: \"9159f256-61e8-41bc-bceb-d602b568ef60\") " pod="openstack/nova-api-db-create-6qwj6" Feb 28 09:21:13 crc kubenswrapper[4687]: I0228 09:21:13.998048 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:21:13 crc kubenswrapper[4687]: I0228 09:21:13.998316 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="55473b1d-46e6-4ee3-953b-36013758c6e8" containerName="ceilometer-central-agent" containerID="cri-o://4d6f76835d7dff86c5ab16f423bf593a4f1d175af0596304247ae50231fa48a5" gracePeriod=30 Feb 28 09:21:13 crc kubenswrapper[4687]: I0228 09:21:13.998441 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="55473b1d-46e6-4ee3-953b-36013758c6e8" containerName="ceilometer-notification-agent" 
containerID="cri-o://3c5a37cb15025e2ea9da64019e231339514fd05d502893554d2668b276e265be" gracePeriod=30 Feb 28 09:21:13 crc kubenswrapper[4687]: I0228 09:21:13.998465 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="55473b1d-46e6-4ee3-953b-36013758c6e8" containerName="sg-core" containerID="cri-o://7328c614e681d041cb5298aaf3c060ab486ec99e7e55c7aea3787f82a35a56a5" gracePeriod=30 Feb 28 09:21:13 crc kubenswrapper[4687]: I0228 09:21:13.998455 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="55473b1d-46e6-4ee3-953b-36013758c6e8" containerName="proxy-httpd" containerID="cri-o://4656e89b07a7ea7b271db19827aae6edf0c11a8f0f6bf7fc313827ad223c9711" gracePeriod=30 Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.029916 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-86rbf"] Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.031321 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-86rbf" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.040592 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-d297-account-create-update-688sh"] Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.041838 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-d297-account-create-update-688sh" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.044579 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.048704 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-86rbf"] Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.049976 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9159f256-61e8-41bc-bceb-d602b568ef60-operator-scripts\") pod \"nova-api-db-create-6qwj6\" (UID: \"9159f256-61e8-41bc-bceb-d602b568ef60\") " pod="openstack/nova-api-db-create-6qwj6" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.050127 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb848\" (UniqueName: \"kubernetes.io/projected/9159f256-61e8-41bc-bceb-d602b568ef60-kube-api-access-pb848\") pod \"nova-api-db-create-6qwj6\" (UID: \"9159f256-61e8-41bc-bceb-d602b568ef60\") " pod="openstack/nova-api-db-create-6qwj6" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.051148 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9159f256-61e8-41bc-bceb-d602b568ef60-operator-scripts\") pod \"nova-api-db-create-6qwj6\" (UID: \"9159f256-61e8-41bc-bceb-d602b568ef60\") " pod="openstack/nova-api-db-create-6qwj6" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.057621 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d297-account-create-update-688sh"] Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.086706 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb848\" (UniqueName: 
\"kubernetes.io/projected/9159f256-61e8-41bc-bceb-d602b568ef60-kube-api-access-pb848\") pod \"nova-api-db-create-6qwj6\" (UID: \"9159f256-61e8-41bc-bceb-d602b568ef60\") " pod="openstack/nova-api-db-create-6qwj6" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.137421 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6qwj6" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.141079 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-fzwm9"] Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.142462 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-fzwm9" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.149628 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-fzwm9"] Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.153518 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5777779d-582f-4e60-ac7f-e194408c31eb-operator-scripts\") pod \"nova-cell0-db-create-86rbf\" (UID: \"5777779d-582f-4e60-ac7f-e194408c31eb\") " pod="openstack/nova-cell0-db-create-86rbf" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.153681 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcaf528f-cd30-4024-b73f-da1ac741ee53-operator-scripts\") pod \"nova-api-d297-account-create-update-688sh\" (UID: \"fcaf528f-cd30-4024-b73f-da1ac741ee53\") " pod="openstack/nova-api-d297-account-create-update-688sh" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.153747 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6kkd\" (UniqueName: 
\"kubernetes.io/projected/fcaf528f-cd30-4024-b73f-da1ac741ee53-kube-api-access-r6kkd\") pod \"nova-api-d297-account-create-update-688sh\" (UID: \"fcaf528f-cd30-4024-b73f-da1ac741ee53\") " pod="openstack/nova-api-d297-account-create-update-688sh" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.153972 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg8wg\" (UniqueName: \"kubernetes.io/projected/5777779d-582f-4e60-ac7f-e194408c31eb-kube-api-access-hg8wg\") pod \"nova-cell0-db-create-86rbf\" (UID: \"5777779d-582f-4e60-ac7f-e194408c31eb\") " pod="openstack/nova-cell0-db-create-86rbf" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.230954 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-a257-account-create-update-9t7cz"] Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.233280 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a257-account-create-update-9t7cz" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.236639 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.246379 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-a257-account-create-update-9t7cz"] Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.255819 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5777779d-582f-4e60-ac7f-e194408c31eb-operator-scripts\") pod \"nova-cell0-db-create-86rbf\" (UID: \"5777779d-582f-4e60-ac7f-e194408c31eb\") " pod="openstack/nova-cell0-db-create-86rbf" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.255909 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/fcaf528f-cd30-4024-b73f-da1ac741ee53-operator-scripts\") pod \"nova-api-d297-account-create-update-688sh\" (UID: \"fcaf528f-cd30-4024-b73f-da1ac741ee53\") " pod="openstack/nova-api-d297-account-create-update-688sh" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.255951 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc1143c7-db81-4638-ad50-a1d7d26d9ad7-operator-scripts\") pod \"nova-cell1-db-create-fzwm9\" (UID: \"dc1143c7-db81-4638-ad50-a1d7d26d9ad7\") " pod="openstack/nova-cell1-db-create-fzwm9" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.255978 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6kkd\" (UniqueName: \"kubernetes.io/projected/fcaf528f-cd30-4024-b73f-da1ac741ee53-kube-api-access-r6kkd\") pod \"nova-api-d297-account-create-update-688sh\" (UID: \"fcaf528f-cd30-4024-b73f-da1ac741ee53\") " pod="openstack/nova-api-d297-account-create-update-688sh" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.256008 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9cv5\" (UniqueName: \"kubernetes.io/projected/dc1143c7-db81-4638-ad50-a1d7d26d9ad7-kube-api-access-l9cv5\") pod \"nova-cell1-db-create-fzwm9\" (UID: \"dc1143c7-db81-4638-ad50-a1d7d26d9ad7\") " pod="openstack/nova-cell1-db-create-fzwm9" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.256131 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg8wg\" (UniqueName: \"kubernetes.io/projected/5777779d-582f-4e60-ac7f-e194408c31eb-kube-api-access-hg8wg\") pod \"nova-cell0-db-create-86rbf\" (UID: \"5777779d-582f-4e60-ac7f-e194408c31eb\") " pod="openstack/nova-cell0-db-create-86rbf" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.257013 4687 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5777779d-582f-4e60-ac7f-e194408c31eb-operator-scripts\") pod \"nova-cell0-db-create-86rbf\" (UID: \"5777779d-582f-4e60-ac7f-e194408c31eb\") " pod="openstack/nova-cell0-db-create-86rbf" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.257663 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcaf528f-cd30-4024-b73f-da1ac741ee53-operator-scripts\") pod \"nova-api-d297-account-create-update-688sh\" (UID: \"fcaf528f-cd30-4024-b73f-da1ac741ee53\") " pod="openstack/nova-api-d297-account-create-update-688sh" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.279082 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg8wg\" (UniqueName: \"kubernetes.io/projected/5777779d-582f-4e60-ac7f-e194408c31eb-kube-api-access-hg8wg\") pod \"nova-cell0-db-create-86rbf\" (UID: \"5777779d-582f-4e60-ac7f-e194408c31eb\") " pod="openstack/nova-cell0-db-create-86rbf" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.295070 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6kkd\" (UniqueName: \"kubernetes.io/projected/fcaf528f-cd30-4024-b73f-da1ac741ee53-kube-api-access-r6kkd\") pod \"nova-api-d297-account-create-update-688sh\" (UID: \"fcaf528f-cd30-4024-b73f-da1ac741ee53\") " pod="openstack/nova-api-d297-account-create-update-688sh" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.359625 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8523b7a8-45d6-4708-b1e7-4c3dbb505640-operator-scripts\") pod \"nova-cell0-a257-account-create-update-9t7cz\" (UID: \"8523b7a8-45d6-4708-b1e7-4c3dbb505640\") " pod="openstack/nova-cell0-a257-account-create-update-9t7cz" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 
09:21:14.359746 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc1143c7-db81-4638-ad50-a1d7d26d9ad7-operator-scripts\") pod \"nova-cell1-db-create-fzwm9\" (UID: \"dc1143c7-db81-4638-ad50-a1d7d26d9ad7\") " pod="openstack/nova-cell1-db-create-fzwm9" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.359789 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndgdn\" (UniqueName: \"kubernetes.io/projected/8523b7a8-45d6-4708-b1e7-4c3dbb505640-kube-api-access-ndgdn\") pod \"nova-cell0-a257-account-create-update-9t7cz\" (UID: \"8523b7a8-45d6-4708-b1e7-4c3dbb505640\") " pod="openstack/nova-cell0-a257-account-create-update-9t7cz" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.359811 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9cv5\" (UniqueName: \"kubernetes.io/projected/dc1143c7-db81-4638-ad50-a1d7d26d9ad7-kube-api-access-l9cv5\") pod \"nova-cell1-db-create-fzwm9\" (UID: \"dc1143c7-db81-4638-ad50-a1d7d26d9ad7\") " pod="openstack/nova-cell1-db-create-fzwm9" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.361006 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc1143c7-db81-4638-ad50-a1d7d26d9ad7-operator-scripts\") pod \"nova-cell1-db-create-fzwm9\" (UID: \"dc1143c7-db81-4638-ad50-a1d7d26d9ad7\") " pod="openstack/nova-cell1-db-create-fzwm9" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.375470 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-86rbf" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.376468 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9cv5\" (UniqueName: \"kubernetes.io/projected/dc1143c7-db81-4638-ad50-a1d7d26d9ad7-kube-api-access-l9cv5\") pod \"nova-cell1-db-create-fzwm9\" (UID: \"dc1143c7-db81-4638-ad50-a1d7d26d9ad7\") " pod="openstack/nova-cell1-db-create-fzwm9" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.399323 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d297-account-create-update-688sh" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.440609 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-68b5-account-create-update-zv5jr"] Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.442969 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-68b5-account-create-update-zv5jr" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.449120 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.465617 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8523b7a8-45d6-4708-b1e7-4c3dbb505640-operator-scripts\") pod \"nova-cell0-a257-account-create-update-9t7cz\" (UID: \"8523b7a8-45d6-4708-b1e7-4c3dbb505640\") " pod="openstack/nova-cell0-a257-account-create-update-9t7cz" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.465766 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndgdn\" (UniqueName: \"kubernetes.io/projected/8523b7a8-45d6-4708-b1e7-4c3dbb505640-kube-api-access-ndgdn\") pod \"nova-cell0-a257-account-create-update-9t7cz\" (UID: \"8523b7a8-45d6-4708-b1e7-4c3dbb505640\") " 
pod="openstack/nova-cell0-a257-account-create-update-9t7cz" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.468755 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8523b7a8-45d6-4708-b1e7-4c3dbb505640-operator-scripts\") pod \"nova-cell0-a257-account-create-update-9t7cz\" (UID: \"8523b7a8-45d6-4708-b1e7-4c3dbb505640\") " pod="openstack/nova-cell0-a257-account-create-update-9t7cz" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.469395 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-fzwm9" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.503808 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndgdn\" (UniqueName: \"kubernetes.io/projected/8523b7a8-45d6-4708-b1e7-4c3dbb505640-kube-api-access-ndgdn\") pod \"nova-cell0-a257-account-create-update-9t7cz\" (UID: \"8523b7a8-45d6-4708-b1e7-4c3dbb505640\") " pod="openstack/nova-cell0-a257-account-create-update-9t7cz" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.503925 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-68b5-account-create-update-zv5jr"] Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.547428 4687 generic.go:334] "Generic (PLEG): container finished" podID="55473b1d-46e6-4ee3-953b-36013758c6e8" containerID="4656e89b07a7ea7b271db19827aae6edf0c11a8f0f6bf7fc313827ad223c9711" exitCode=0 Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.547467 4687 generic.go:334] "Generic (PLEG): container finished" podID="55473b1d-46e6-4ee3-953b-36013758c6e8" containerID="7328c614e681d041cb5298aaf3c060ab486ec99e7e55c7aea3787f82a35a56a5" exitCode=2 Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.547477 4687 generic.go:334] "Generic (PLEG): container finished" podID="55473b1d-46e6-4ee3-953b-36013758c6e8" 
containerID="3c5a37cb15025e2ea9da64019e231339514fd05d502893554d2668b276e265be" exitCode=0 Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.547503 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55473b1d-46e6-4ee3-953b-36013758c6e8","Type":"ContainerDied","Data":"4656e89b07a7ea7b271db19827aae6edf0c11a8f0f6bf7fc313827ad223c9711"} Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.547555 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55473b1d-46e6-4ee3-953b-36013758c6e8","Type":"ContainerDied","Data":"7328c614e681d041cb5298aaf3c060ab486ec99e7e55c7aea3787f82a35a56a5"} Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.547569 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55473b1d-46e6-4ee3-953b-36013758c6e8","Type":"ContainerDied","Data":"3c5a37cb15025e2ea9da64019e231339514fd05d502893554d2668b276e265be"} Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.556227 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-a257-account-create-update-9t7cz" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.569776 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz69l\" (UniqueName: \"kubernetes.io/projected/131e7bdc-bd19-4a7e-b0ad-a561c7f3a857-kube-api-access-kz69l\") pod \"nova-cell1-68b5-account-create-update-zv5jr\" (UID: \"131e7bdc-bd19-4a7e-b0ad-a561c7f3a857\") " pod="openstack/nova-cell1-68b5-account-create-update-zv5jr" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.569820 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/131e7bdc-bd19-4a7e-b0ad-a561c7f3a857-operator-scripts\") pod \"nova-cell1-68b5-account-create-update-zv5jr\" (UID: \"131e7bdc-bd19-4a7e-b0ad-a561c7f3a857\") " pod="openstack/nova-cell1-68b5-account-create-update-zv5jr" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.671542 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz69l\" (UniqueName: \"kubernetes.io/projected/131e7bdc-bd19-4a7e-b0ad-a561c7f3a857-kube-api-access-kz69l\") pod \"nova-cell1-68b5-account-create-update-zv5jr\" (UID: \"131e7bdc-bd19-4a7e-b0ad-a561c7f3a857\") " pod="openstack/nova-cell1-68b5-account-create-update-zv5jr" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.671589 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/131e7bdc-bd19-4a7e-b0ad-a561c7f3a857-operator-scripts\") pod \"nova-cell1-68b5-account-create-update-zv5jr\" (UID: \"131e7bdc-bd19-4a7e-b0ad-a561c7f3a857\") " pod="openstack/nova-cell1-68b5-account-create-update-zv5jr" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.672362 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/131e7bdc-bd19-4a7e-b0ad-a561c7f3a857-operator-scripts\") pod \"nova-cell1-68b5-account-create-update-zv5jr\" (UID: \"131e7bdc-bd19-4a7e-b0ad-a561c7f3a857\") " pod="openstack/nova-cell1-68b5-account-create-update-zv5jr" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.687202 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz69l\" (UniqueName: \"kubernetes.io/projected/131e7bdc-bd19-4a7e-b0ad-a561c7f3a857-kube-api-access-kz69l\") pod \"nova-cell1-68b5-account-create-update-zv5jr\" (UID: \"131e7bdc-bd19-4a7e-b0ad-a561c7f3a857\") " pod="openstack/nova-cell1-68b5-account-create-update-zv5jr" Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.760086 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.760357 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cdfc7644-a187-4fe9-8067-fa474114c1a1" containerName="glance-log" containerID="cri-o://18848427d0dbbc8a8ada0f9975ef90eeed3cc2e0c27b19992c9f3cf0afc1647c" gracePeriod=30 Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.760580 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cdfc7644-a187-4fe9-8067-fa474114c1a1" containerName="glance-httpd" containerID="cri-o://b2671320ae659644d88f9255139ef23295ecac63a870898b1adfa50fddbad460" gracePeriod=30 Feb 28 09:21:14 crc kubenswrapper[4687]: I0228 09:21:14.872709 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-68b5-account-create-update-zv5jr" Feb 28 09:21:15 crc kubenswrapper[4687]: I0228 09:21:15.559889 4687 generic.go:334] "Generic (PLEG): container finished" podID="cdfc7644-a187-4fe9-8067-fa474114c1a1" containerID="18848427d0dbbc8a8ada0f9975ef90eeed3cc2e0c27b19992c9f3cf0afc1647c" exitCode=143 Feb 28 09:21:15 crc kubenswrapper[4687]: I0228 09:21:15.559962 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cdfc7644-a187-4fe9-8067-fa474114c1a1","Type":"ContainerDied","Data":"18848427d0dbbc8a8ada0f9975ef90eeed3cc2e0c27b19992c9f3cf0afc1647c"} Feb 28 09:21:15 crc kubenswrapper[4687]: I0228 09:21:15.564169 4687 generic.go:334] "Generic (PLEG): container finished" podID="55473b1d-46e6-4ee3-953b-36013758c6e8" containerID="4d6f76835d7dff86c5ab16f423bf593a4f1d175af0596304247ae50231fa48a5" exitCode=0 Feb 28 09:21:15 crc kubenswrapper[4687]: I0228 09:21:15.564225 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55473b1d-46e6-4ee3-953b-36013758c6e8","Type":"ContainerDied","Data":"4d6f76835d7dff86c5ab16f423bf593a4f1d175af0596304247ae50231fa48a5"} Feb 28 09:21:15 crc kubenswrapper[4687]: W0228 09:21:15.696878 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42eabfaf_28a5_4986_ad88_a93859225843.slice/crio-f7f614e24e6b8cbbf14ae24850ac1463ccbf43398ae08c6a403bea74d91d5729.scope WatchSource:0}: Error finding container f7f614e24e6b8cbbf14ae24850ac1463ccbf43398ae08c6a403bea74d91d5729: Status 404 returned error can't find the container with id f7f614e24e6b8cbbf14ae24850ac1463ccbf43398ae08c6a403bea74d91d5729 Feb 28 09:21:15 crc kubenswrapper[4687]: W0228 09:21:15.698546 4687 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42eabfaf_28a5_4986_ad88_a93859225843.slice/crio-b0bf191f33628fb62188c40a46bede2b789b37bdee9687877e8f5cdd31171f62.scope WatchSource:0}: Error finding container b0bf191f33628fb62188c40a46bede2b789b37bdee9687877e8f5cdd31171f62: Status 404 returned error can't find the container with id b0bf191f33628fb62188c40a46bede2b789b37bdee9687877e8f5cdd31171f62 Feb 28 09:21:15 crc kubenswrapper[4687]: W0228 09:21:15.704010 4687 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbad05ef2_b8b3_4844_a104_7bf24d1398b0.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbad05ef2_b8b3_4844_a104_7bf24d1398b0.slice: no such file or directory Feb 28 09:21:15 crc kubenswrapper[4687]: W0228 09:21:15.704089 4687 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa88f1b2_477c_461c_a044_88fd35c31231.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa88f1b2_477c_461c_a044_88fd35c31231.slice: no such file or directory Feb 28 09:21:15 crc kubenswrapper[4687]: W0228 09:21:15.708623 4687 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a0893a8_0386_4d6d_9476_c061c3fb5f3d.slice/crio-conmon-a1fed5d6d5d0aa69f698abb8377dbac8adffa03875f6933981857f1250afb4e3.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a0893a8_0386_4d6d_9476_c061c3fb5f3d.slice/crio-conmon-a1fed5d6d5d0aa69f698abb8377dbac8adffa03875f6933981857f1250afb4e3.scope: no such file or directory Feb 28 09:21:15 crc kubenswrapper[4687]: W0228 09:21:15.708675 4687 watcher.go:93] Error while 
processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a0893a8_0386_4d6d_9476_c061c3fb5f3d.slice/crio-a1fed5d6d5d0aa69f698abb8377dbac8adffa03875f6933981857f1250afb4e3.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a0893a8_0386_4d6d_9476_c061c3fb5f3d.slice/crio-a1fed5d6d5d0aa69f698abb8377dbac8adffa03875f6933981857f1250afb4e3.scope: no such file or directory Feb 28 09:21:15 crc kubenswrapper[4687]: W0228 09:21:15.714665 4687 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb82a8aed_cc7b_4802_80f0_63e701ee0593.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb82a8aed_cc7b_4802_80f0_63e701ee0593.slice: no such file or directory Feb 28 09:21:15 crc kubenswrapper[4687]: W0228 09:21:15.716873 4687 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2c76976_8cdb_45e0_826d_5d465de1829c.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2c76976_8cdb_45e0_826d_5d465de1829c.slice: no such file or directory Feb 28 09:21:15 crc kubenswrapper[4687]: W0228 09:21:15.731781 4687 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53078962_6c8c_436e_8d57_e2ed7e9e2b6e.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53078962_6c8c_436e_8d57_e2ed7e9e2b6e.slice: no such file or directory Feb 28 09:21:15 crc kubenswrapper[4687]: E0228 09:21:15.869555 4687 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e5e221e_73c7_44a2_9af9_0feb60b412e0.slice/crio-36d8793b5506960f0edd95fae453cc7431c4d82d7aee4458db381af12f245d6b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84c40408_c638_4bea_86d5_fb40a60b6975.slice/crio-0a0a86b425d00964404e37a811d5b05c915d79a2bfeb451d659a2a38fec5dd2f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd655bdf4_33ab_45fa_b1e4_c37aede5609a.slice/crio-1f47f176744fd7232de0f9faea595a9e3333827c6923ad75f5f60d0995f4502e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a0893a8_0386_4d6d_9476_c061c3fb5f3d.slice/crio-ec2211cc8159f7654685062ebd6bbc5d493f2f317474a1dfca1a6c26b052d1b7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84c40408_c638_4bea_86d5_fb40a60b6975.slice/crio-conmon-61b4a041e894cea9908f5c65adf16323390c21a79756431a89555ce4ae9d050a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a06887c_91c5_43bb_8631_53fac29e79b6.slice/crio-57eba8c8848cfdc58b9d231bc4a845a3aef1d76384a7fc2e2fb3b3a4dcffe324.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76f683cb_cc38_4cdd_a0f0_1077410b1768.slice/crio-conmon-1c9ef7104fc110694f07caf4f711aeccd7d3058ee4396c11ab6e145a2805b318.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84c40408_c638_4bea_86d5_fb40a60b6975.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76f683cb_cc38_4cdd_a0f0_1077410b1768.slice/crio-1c9ef7104fc110694f07caf4f711aeccd7d3058ee4396c11ab6e145a2805b318.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0859ec96_842c_472a_be1b_f29c8f1df2d9.slice/crio-8c0f0bab64ff709f237761dab2e575643a7140ce428e1242ca10ffd15bd720ce.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod455f8be2_a725_49fb_ba76_6f3e6c4cb34d.slice/crio-99d7de2b7db74ea8113bfc0922f6805ecb6418596566a8ad9d8acf61d9569ffd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd655bdf4_33ab_45fa_b1e4_c37aede5609a.slice/crio-51b219e86f3b0d6b4919b070002226d15fce4b8fe16494e79bab096be1e39e20.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a06887c_91c5_43bb_8631_53fac29e79b6.slice/crio-conmon-57eba8c8848cfdc58b9d231bc4a845a3aef1d76384a7fc2e2fb3b3a4dcffe324.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27799696_4eb6_4ef9_9440_151a3929d699.slice/crio-712de4921f163318aadd23457ab174bf0c4fb55adf335f7d52d76cf15375c37e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0859ec96_842c_472a_be1b_f29c8f1df2d9.slice/crio-conmon-8c0f0bab64ff709f237761dab2e575643a7140ce428e1242ca10ffd15bd720ce.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27799696_4eb6_4ef9_9440_151a3929d699.slice/crio-aafeb892e6f15626514b11a0c74fd9d9c18cc477eec929ba61e66e431cb01d28.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27799696_4eb6_4ef9_9440_151a3929d699.slice/crio-9cb4ddb764f0c5a30b40d129e1c56024c04b6a19cb224b015cfc83c54194d2da\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84c40408_c638_4bea_86d5_fb40a60b6975.slice/crio-61b4a041e894cea9908f5c65adf16323390c21a79756431a89555ce4ae9d050a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a06887c_91c5_43bb_8631_53fac29e79b6.slice/crio-conmon-72f5b1d21b2565af1ff09d9cba487ca40b4971d91a32230255a8e098ffc62761.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e5e221e_73c7_44a2_9af9_0feb60b412e0.slice/crio-2208c8fed1e88f5b5b0ba488bcffee0b225598f3f7537481f3ae92ba150a8d1d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a0893a8_0386_4d6d_9476_c061c3fb5f3d.slice/crio-conmon-404b8da225a564a9322c0d472094c80332802f0e803b8ac973b8bb4bfb07d4de.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a0893a8_0386_4d6d_9476_c061c3fb5f3d.slice/crio-404b8da225a564a9322c0d472094c80332802f0e803b8ac973b8bb4bfb07d4de.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd655bdf4_33ab_45fa_b1e4_c37aede5609a.slice/crio-conmon-1f47f176744fd7232de0f9faea595a9e3333827c6923ad75f5f60d0995f4502e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76f683cb_cc38_4cdd_a0f0_1077410b1768.slice/crio-e8d5b812f2edc197ed1fa8b0a0914b0152c058afe4646764eeeefcfa6ffe9e43\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27799696_4eb6_4ef9_9440_151a3929d699.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod455f8be2_a725_49fb_ba76_6f3e6c4cb34d.slice/crio-conmon-99d7de2b7db74ea8113bfc0922f6805ecb6418596566a8ad9d8acf61d9569ffd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84c40408_c638_4bea_86d5_fb40a60b6975.slice/crio-b68cf1027bec3caa61756b0cafa9065fb6425e37e50d692bf7d2a9d913ffb111.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a0893a8_0386_4d6d_9476_c061c3fb5f3d.slice/crio-conmon-ec2211cc8159f7654685062ebd6bbc5d493f2f317474a1dfca1a6c26b052d1b7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e5e221e_73c7_44a2_9af9_0feb60b412e0.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e5e221e_73c7_44a2_9af9_0feb60b412e0.slice/crio-conmon-36d8793b5506960f0edd95fae453cc7431c4d82d7aee4458db381af12f245d6b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84c40408_c638_4bea_86d5_fb40a60b6975.slice/crio-conmon-b68cf1027bec3caa61756b0cafa9065fb6425e37e50d692bf7d2a9d913ffb111.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod455f8be2_a725_49fb_ba76_6f3e6c4cb34d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdfc7644_a187_4fe9_8067_fa474114c1a1.slice/crio-18848427d0dbbc8a8ada0f9975ef90eeed3cc2e0c27b19992c9f3cf0afc1647c.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27799696_4eb6_4ef9_9440_151a3929d699.slice/crio-conmon-aafeb892e6f15626514b11a0c74fd9d9c18cc477eec929ba61e66e431cb01d28.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a06887c_91c5_43bb_8631_53fac29e79b6.slice/crio-72f5b1d21b2565af1ff09d9cba487ca40b4971d91a32230255a8e098ffc62761.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76f683cb_cc38_4cdd_a0f0_1077410b1768.slice/crio-f325690874bfb899167706dea38c4f57ef91836e19d44224b585c114ace4221d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42eabfaf_28a5_4986_ad88_a93859225843.slice/crio-conmon-b0bf191f33628fb62188c40a46bede2b789b37bdee9687877e8f5cdd31171f62.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdfc7644_a187_4fe9_8067_fa474114c1a1.slice/crio-conmon-18848427d0dbbc8a8ada0f9975ef90eeed3cc2e0c27b19992c9f3cf0afc1647c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd655bdf4_33ab_45fa_b1e4_c37aede5609a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42eabfaf_28a5_4986_ad88_a93859225843.slice/crio-conmon-f7f614e24e6b8cbbf14ae24850ac1463ccbf43398ae08c6a403bea74d91d5729.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27799696_4eb6_4ef9_9440_151a3929d699.slice/crio-conmon-712de4921f163318aadd23457ab174bf0c4fb55adf335f7d52d76cf15375c37e.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a0893a8_0386_4d6d_9476_c061c3fb5f3d.slice/crio-b83edc249187f94706cb88fa7b442c63cc2c247afe76eefd355ca88641fe4c06\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod455f8be2_a725_49fb_ba76_6f3e6c4cb34d.slice/crio-9ba3383f945d7b2472026c92c72afaf80f70e31989b5540c8090bf0e0bff0dcd\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a0893a8_0386_4d6d_9476_c061c3fb5f3d.slice\": RecentStats: unable to find data in memory cache]" Feb 28 09:21:16 crc kubenswrapper[4687]: I0228 09:21:16.573729 4687 generic.go:334] "Generic (PLEG): container finished" podID="6a06887c-91c5-43bb-8631-53fac29e79b6" containerID="72f5b1d21b2565af1ff09d9cba487ca40b4971d91a32230255a8e098ffc62761" exitCode=137 Feb 28 09:21:16 crc kubenswrapper[4687]: I0228 09:21:16.573810 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d58956cb6-f8plp" event={"ID":"6a06887c-91c5-43bb-8631-53fac29e79b6","Type":"ContainerDied","Data":"72f5b1d21b2565af1ff09d9cba487ca40b4971d91a32230255a8e098ffc62761"} Feb 28 09:21:16 crc kubenswrapper[4687]: I0228 09:21:16.576225 4687 generic.go:334] "Generic (PLEG): container finished" podID="0859ec96-842c-472a-be1b-f29c8f1df2d9" containerID="03244fa16f2b84c19f33830379f405964557d73d6134657ac158003cd9026866" exitCode=0 Feb 28 09:21:16 crc kubenswrapper[4687]: I0228 09:21:16.576259 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b9c5b669b-xd8lz" event={"ID":"0859ec96-842c-472a-be1b-f29c8f1df2d9","Type":"ContainerDied","Data":"03244fa16f2b84c19f33830379f405964557d73d6134657ac158003cd9026866"} Feb 28 09:21:17 crc kubenswrapper[4687]: I0228 09:21:17.195404 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 28 09:21:17 crc kubenswrapper[4687]: I0228 09:21:17.195717 4687 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="59ec19ad-b746-417b-a573-1b450746e794" containerName="glance-httpd" containerID="cri-o://8bd1539f05f84dff93650ce81fe1fb27a301643199250c07815c3f641b7b68d3" gracePeriod=30 Feb 28 09:21:17 crc kubenswrapper[4687]: I0228 09:21:17.195655 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="59ec19ad-b746-417b-a573-1b450746e794" containerName="glance-log" containerID="cri-o://028844f2d4127d97e4dcbbf0a6c2f4aa6f538feb591e1cd7ad283e048ad0153f" gracePeriod=30 Feb 28 09:21:17 crc kubenswrapper[4687]: I0228 09:21:17.597978 4687 generic.go:334] "Generic (PLEG): container finished" podID="59ec19ad-b746-417b-a573-1b450746e794" containerID="028844f2d4127d97e4dcbbf0a6c2f4aa6f538feb591e1cd7ad283e048ad0153f" exitCode=143 Feb 28 09:21:17 crc kubenswrapper[4687]: I0228 09:21:17.598140 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"59ec19ad-b746-417b-a573-1b450746e794","Type":"ContainerDied","Data":"028844f2d4127d97e4dcbbf0a6c2f4aa6f538feb591e1cd7ad283e048ad0153f"} Feb 28 09:21:17 crc kubenswrapper[4687]: I0228 09:21:17.900748 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7b9c5b669b-xd8lz" Feb 28 09:21:17 crc kubenswrapper[4687]: I0228 09:21:17.926582 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:21:17 crc kubenswrapper[4687]: I0228 09:21:17.970634 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.044398 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a06887c-91c5-43bb-8631-53fac29e79b6-logs\") pod \"6a06887c-91c5-43bb-8631-53fac29e79b6\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.044717 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv5ph\" (UniqueName: \"kubernetes.io/projected/0859ec96-842c-472a-be1b-f29c8f1df2d9-kube-api-access-qv5ph\") pod \"0859ec96-842c-472a-be1b-f29c8f1df2d9\" (UID: \"0859ec96-842c-472a-be1b-f29c8f1df2d9\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.044785 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6a06887c-91c5-43bb-8631-53fac29e79b6-horizon-secret-key\") pod \"6a06887c-91c5-43bb-8631-53fac29e79b6\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.044818 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55473b1d-46e6-4ee3-953b-36013758c6e8-combined-ca-bundle\") pod \"55473b1d-46e6-4ee3-953b-36013758c6e8\" (UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.044835 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlkkh\" (UniqueName: \"kubernetes.io/projected/55473b1d-46e6-4ee3-953b-36013758c6e8-kube-api-access-zlkkh\") pod \"55473b1d-46e6-4ee3-953b-36013758c6e8\" (UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.044869 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0859ec96-842c-472a-be1b-f29c8f1df2d9-ovndb-tls-certs\") pod \"0859ec96-842c-472a-be1b-f29c8f1df2d9\" (UID: \"0859ec96-842c-472a-be1b-f29c8f1df2d9\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.044943 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a06887c-91c5-43bb-8631-53fac29e79b6-logs" (OuterVolumeSpecName: "logs") pod "6a06887c-91c5-43bb-8631-53fac29e79b6" (UID: "6a06887c-91c5-43bb-8631-53fac29e79b6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.044976 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a06887c-91c5-43bb-8631-53fac29e79b6-horizon-tls-certs\") pod \"6a06887c-91c5-43bb-8631-53fac29e79b6\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.045050 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a06887c-91c5-43bb-8631-53fac29e79b6-config-data\") pod \"6a06887c-91c5-43bb-8631-53fac29e79b6\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.045094 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55473b1d-46e6-4ee3-953b-36013758c6e8-config-data\") pod \"55473b1d-46e6-4ee3-953b-36013758c6e8\" (UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.045137 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0859ec96-842c-472a-be1b-f29c8f1df2d9-httpd-config\") pod \"0859ec96-842c-472a-be1b-f29c8f1df2d9\" (UID: 
\"0859ec96-842c-472a-be1b-f29c8f1df2d9\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.045157 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0859ec96-842c-472a-be1b-f29c8f1df2d9-config\") pod \"0859ec96-842c-472a-be1b-f29c8f1df2d9\" (UID: \"0859ec96-842c-472a-be1b-f29c8f1df2d9\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.045245 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55473b1d-46e6-4ee3-953b-36013758c6e8-log-httpd\") pod \"55473b1d-46e6-4ee3-953b-36013758c6e8\" (UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.045267 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n24bp\" (UniqueName: \"kubernetes.io/projected/6a06887c-91c5-43bb-8631-53fac29e79b6-kube-api-access-n24bp\") pod \"6a06887c-91c5-43bb-8631-53fac29e79b6\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.045292 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55473b1d-46e6-4ee3-953b-36013758c6e8-run-httpd\") pod \"55473b1d-46e6-4ee3-953b-36013758c6e8\" (UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.045329 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55473b1d-46e6-4ee3-953b-36013758c6e8-sg-core-conf-yaml\") pod \"55473b1d-46e6-4ee3-953b-36013758c6e8\" (UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.045387 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6a06887c-91c5-43bb-8631-53fac29e79b6-combined-ca-bundle\") pod \"6a06887c-91c5-43bb-8631-53fac29e79b6\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.045414 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0859ec96-842c-472a-be1b-f29c8f1df2d9-combined-ca-bundle\") pod \"0859ec96-842c-472a-be1b-f29c8f1df2d9\" (UID: \"0859ec96-842c-472a-be1b-f29c8f1df2d9\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.045454 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a06887c-91c5-43bb-8631-53fac29e79b6-scripts\") pod \"6a06887c-91c5-43bb-8631-53fac29e79b6\" (UID: \"6a06887c-91c5-43bb-8631-53fac29e79b6\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.045479 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55473b1d-46e6-4ee3-953b-36013758c6e8-scripts\") pod \"55473b1d-46e6-4ee3-953b-36013758c6e8\" (UID: \"55473b1d-46e6-4ee3-953b-36013758c6e8\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.045913 4687 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a06887c-91c5-43bb-8631-53fac29e79b6-logs\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.045958 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55473b1d-46e6-4ee3-953b-36013758c6e8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "55473b1d-46e6-4ee3-953b-36013758c6e8" (UID: "55473b1d-46e6-4ee3-953b-36013758c6e8"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.047903 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55473b1d-46e6-4ee3-953b-36013758c6e8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "55473b1d-46e6-4ee3-953b-36013758c6e8" (UID: "55473b1d-46e6-4ee3-953b-36013758c6e8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.051751 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a06887c-91c5-43bb-8631-53fac29e79b6-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "6a06887c-91c5-43bb-8631-53fac29e79b6" (UID: "6a06887c-91c5-43bb-8631-53fac29e79b6"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.054208 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0859ec96-842c-472a-be1b-f29c8f1df2d9-kube-api-access-qv5ph" (OuterVolumeSpecName: "kube-api-access-qv5ph") pod "0859ec96-842c-472a-be1b-f29c8f1df2d9" (UID: "0859ec96-842c-472a-be1b-f29c8f1df2d9"). InnerVolumeSpecName "kube-api-access-qv5ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.054552 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55473b1d-46e6-4ee3-953b-36013758c6e8-scripts" (OuterVolumeSpecName: "scripts") pod "55473b1d-46e6-4ee3-953b-36013758c6e8" (UID: "55473b1d-46e6-4ee3-953b-36013758c6e8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.055930 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a06887c-91c5-43bb-8631-53fac29e79b6-kube-api-access-n24bp" (OuterVolumeSpecName: "kube-api-access-n24bp") pod "6a06887c-91c5-43bb-8631-53fac29e79b6" (UID: "6a06887c-91c5-43bb-8631-53fac29e79b6"). InnerVolumeSpecName "kube-api-access-n24bp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.065317 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55473b1d-46e6-4ee3-953b-36013758c6e8-kube-api-access-zlkkh" (OuterVolumeSpecName: "kube-api-access-zlkkh") pod "55473b1d-46e6-4ee3-953b-36013758c6e8" (UID: "55473b1d-46e6-4ee3-953b-36013758c6e8"). InnerVolumeSpecName "kube-api-access-zlkkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.065416 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0859ec96-842c-472a-be1b-f29c8f1df2d9-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0859ec96-842c-472a-be1b-f29c8f1df2d9" (UID: "0859ec96-842c-472a-be1b-f29c8f1df2d9"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.069810 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a06887c-91c5-43bb-8631-53fac29e79b6-scripts" (OuterVolumeSpecName: "scripts") pod "6a06887c-91c5-43bb-8631-53fac29e79b6" (UID: "6a06887c-91c5-43bb-8631-53fac29e79b6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.085142 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55473b1d-46e6-4ee3-953b-36013758c6e8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "55473b1d-46e6-4ee3-953b-36013758c6e8" (UID: "55473b1d-46e6-4ee3-953b-36013758c6e8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.097046 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a06887c-91c5-43bb-8631-53fac29e79b6-config-data" (OuterVolumeSpecName: "config-data") pod "6a06887c-91c5-43bb-8631-53fac29e79b6" (UID: "6a06887c-91c5-43bb-8631-53fac29e79b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.104892 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a06887c-91c5-43bb-8631-53fac29e79b6-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "6a06887c-91c5-43bb-8631-53fac29e79b6" (UID: "6a06887c-91c5-43bb-8631-53fac29e79b6"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.109474 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a06887c-91c5-43bb-8631-53fac29e79b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a06887c-91c5-43bb-8631-53fac29e79b6" (UID: "6a06887c-91c5-43bb-8631-53fac29e79b6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.119551 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0859ec96-842c-472a-be1b-f29c8f1df2d9-config" (OuterVolumeSpecName: "config") pod "0859ec96-842c-472a-be1b-f29c8f1df2d9" (UID: "0859ec96-842c-472a-be1b-f29c8f1df2d9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.128268 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0859ec96-842c-472a-be1b-f29c8f1df2d9-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0859ec96-842c-472a-be1b-f29c8f1df2d9" (UID: "0859ec96-842c-472a-be1b-f29c8f1df2d9"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.142364 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0859ec96-842c-472a-be1b-f29c8f1df2d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0859ec96-842c-472a-be1b-f29c8f1df2d9" (UID: "0859ec96-842c-472a-be1b-f29c8f1df2d9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.149010 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a06887c-91c5-43bb-8631-53fac29e79b6-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.149057 4687 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0859ec96-842c-472a-be1b-f29c8f1df2d9-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.149076 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0859ec96-842c-472a-be1b-f29c8f1df2d9-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.149089 4687 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55473b1d-46e6-4ee3-953b-36013758c6e8-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.149101 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n24bp\" (UniqueName: \"kubernetes.io/projected/6a06887c-91c5-43bb-8631-53fac29e79b6-kube-api-access-n24bp\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.149114 4687 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55473b1d-46e6-4ee3-953b-36013758c6e8-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.149123 4687 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55473b1d-46e6-4ee3-953b-36013758c6e8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.149137 4687 reconciler_common.go:293] "Volume 
detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a06887c-91c5-43bb-8631-53fac29e79b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.149145 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0859ec96-842c-472a-be1b-f29c8f1df2d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.149155 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a06887c-91c5-43bb-8631-53fac29e79b6-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.149164 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55473b1d-46e6-4ee3-953b-36013758c6e8-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.149174 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv5ph\" (UniqueName: \"kubernetes.io/projected/0859ec96-842c-472a-be1b-f29c8f1df2d9-kube-api-access-qv5ph\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.149185 4687 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6a06887c-91c5-43bb-8631-53fac29e79b6-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.149196 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlkkh\" (UniqueName: \"kubernetes.io/projected/55473b1d-46e6-4ee3-953b-36013758c6e8-kube-api-access-zlkkh\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.149206 4687 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0859ec96-842c-472a-be1b-f29c8f1df2d9-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.149218 4687 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a06887c-91c5-43bb-8631-53fac29e79b6-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.157713 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55473b1d-46e6-4ee3-953b-36013758c6e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55473b1d-46e6-4ee3-953b-36013758c6e8" (UID: "55473b1d-46e6-4ee3-953b-36013758c6e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.174374 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55473b1d-46e6-4ee3-953b-36013758c6e8-config-data" (OuterVolumeSpecName: "config-data") pod "55473b1d-46e6-4ee3-953b-36013758c6e8" (UID: "55473b1d-46e6-4ee3-953b-36013758c6e8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.250354 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55473b1d-46e6-4ee3-953b-36013758c6e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.250385 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55473b1d-46e6-4ee3-953b-36013758c6e8-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.288486 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-a257-account-create-update-9t7cz"] Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.476186 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-fzwm9"] Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.495516 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-86rbf"] Feb 28 09:21:18 crc kubenswrapper[4687]: W0228 09:21:18.498488 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5777779d_582f_4e60_ac7f_e194408c31eb.slice/crio-b90ae8989f975b805d3460ef6838d4b4ce41cbd537a13eba55f7bf3e0a0f7d2a WatchSource:0}: Error finding container b90ae8989f975b805d3460ef6838d4b4ce41cbd537a13eba55f7bf3e0a0f7d2a: Status 404 returned error can't find the container with id b90ae8989f975b805d3460ef6838d4b4ce41cbd537a13eba55f7bf3e0a0f7d2a Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.506110 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-6qwj6"] Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.512302 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-68b5-account-create-update-zv5jr"] Feb 28 09:21:18 crc 
kubenswrapper[4687]: I0228 09:21:18.516508 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d297-account-create-update-688sh"] Feb 28 09:21:18 crc kubenswrapper[4687]: W0228 09:21:18.516769 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9159f256_61e8_41bc_bceb_d602b568ef60.slice/crio-6214490f17418c95afff3c41658eef3c4691532824330be95a2a3b366498f63b WatchSource:0}: Error finding container 6214490f17418c95afff3c41658eef3c4691532824330be95a2a3b366498f63b: Status 404 returned error can't find the container with id 6214490f17418c95afff3c41658eef3c4691532824330be95a2a3b366498f63b Feb 28 09:21:18 crc kubenswrapper[4687]: W0228 09:21:18.520593 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod131e7bdc_bd19_4a7e_b0ad_a561c7f3a857.slice/crio-299cc218229aa755c91033efa47ef70b296085cd6279c72f9225a3312a4c2799 WatchSource:0}: Error finding container 299cc218229aa755c91033efa47ef70b296085cd6279c72f9225a3312a4c2799: Status 404 returned error can't find the container with id 299cc218229aa755c91033efa47ef70b296085cd6279c72f9225a3312a4c2799 Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.627327 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fzwm9" event={"ID":"dc1143c7-db81-4638-ad50-a1d7d26d9ad7","Type":"ContainerStarted","Data":"0ca3dfbcb0ad83bac58c4bda08d84f878fddd43344ebc1f8c9cd104d9757d20f"} Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.635311 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d58956cb6-f8plp" event={"ID":"6a06887c-91c5-43bb-8631-53fac29e79b6","Type":"ContainerDied","Data":"73128560e01e97d3de44cb4de5cead387621b152131260aab0f010990e438d7d"} Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.635358 4687 scope.go:117] "RemoveContainer" 
containerID="57eba8c8848cfdc58b9d231bc4a845a3aef1d76384a7fc2e2fb3b3a4dcffe324" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.635458 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d58956cb6-f8plp" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.645027 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-86rbf" event={"ID":"5777779d-582f-4e60-ac7f-e194408c31eb","Type":"ContainerStarted","Data":"b90ae8989f975b805d3460ef6838d4b4ce41cbd537a13eba55f7bf3e0a0f7d2a"} Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.649085 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.652563 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39","Type":"ContainerStarted","Data":"1edbbe838336ab030ab00b63596504f481e0adfde5d8f6388faa83b126148f43"} Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.653746 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-68b5-account-create-update-zv5jr" event={"ID":"131e7bdc-bd19-4a7e-b0ad-a561c7f3a857","Type":"ContainerStarted","Data":"299cc218229aa755c91033efa47ef70b296085cd6279c72f9225a3312a4c2799"} Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.655594 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b9c5b669b-xd8lz" event={"ID":"0859ec96-842c-472a-be1b-f29c8f1df2d9","Type":"ContainerDied","Data":"b95d74220f77d6ba675759f720999213e1d9f6762c8feec6b655b77a35bd9d13"} Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.655659 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7b9c5b669b-xd8lz" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.705879 4687 generic.go:334] "Generic (PLEG): container finished" podID="cdfc7644-a187-4fe9-8067-fa474114c1a1" containerID="b2671320ae659644d88f9255139ef23295ecac63a870898b1adfa50fddbad460" exitCode=0 Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.705968 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.721631 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.745618 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d297-account-create-update-688sh" event={"ID":"fcaf528f-cd30-4024-b73f-da1ac741ee53","Type":"ContainerStarted","Data":"104ef930d7ec64d156803b37c0833b8445a5b4c8d75225eab2373345eddd9bee"} Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.745653 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cdfc7644-a187-4fe9-8067-fa474114c1a1","Type":"ContainerDied","Data":"b2671320ae659644d88f9255139ef23295ecac63a870898b1adfa50fddbad460"} Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.745678 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cdfc7644-a187-4fe9-8067-fa474114c1a1","Type":"ContainerDied","Data":"14390a5d012080e61bd12d53a120b0b67ebd8ff5a0deabc8deb56bd16cd47266"} Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.745691 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a257-account-create-update-9t7cz" event={"ID":"8523b7a8-45d6-4708-b1e7-4c3dbb505640","Type":"ContainerStarted","Data":"879a5a4f6bce04b8d0f622b16c280ec2579a68147499ef39475b3805b16f0c0d"} Feb 28 09:21:18 crc 
kubenswrapper[4687]: I0228 09:21:18.745704 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a257-account-create-update-9t7cz" event={"ID":"8523b7a8-45d6-4708-b1e7-4c3dbb505640","Type":"ContainerStarted","Data":"b1ca34c0ed46a7804b81140e134de65df9df20ef150602bb71f5fd73cd4e52eb"} Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.745712 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55473b1d-46e6-4ee3-953b-36013758c6e8","Type":"ContainerDied","Data":"1c895ced0e7308a7f04bf915a8cb013d135a2dbc10eec3bfdfdc15245c0df893"} Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.745724 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6qwj6" event={"ID":"9159f256-61e8-41bc-bceb-d602b568ef60","Type":"ContainerStarted","Data":"6214490f17418c95afff3c41658eef3c4691532824330be95a2a3b366498f63b"} Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.761701 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfc7644-a187-4fe9-8067-fa474114c1a1-scripts\") pod \"cdfc7644-a187-4fe9-8067-fa474114c1a1\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.761772 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cdfc7644-a187-4fe9-8067-fa474114c1a1\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.761887 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdfc7644-a187-4fe9-8067-fa474114c1a1-public-tls-certs\") pod \"cdfc7644-a187-4fe9-8067-fa474114c1a1\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.761953 4687 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cdfc7644-a187-4fe9-8067-fa474114c1a1-httpd-run\") pod \"cdfc7644-a187-4fe9-8067-fa474114c1a1\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.762060 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfc7644-a187-4fe9-8067-fa474114c1a1-combined-ca-bundle\") pod \"cdfc7644-a187-4fe9-8067-fa474114c1a1\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.762115 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfc7644-a187-4fe9-8067-fa474114c1a1-config-data\") pod \"cdfc7644-a187-4fe9-8067-fa474114c1a1\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.762152 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdfc7644-a187-4fe9-8067-fa474114c1a1-logs\") pod \"cdfc7644-a187-4fe9-8067-fa474114c1a1\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.762167 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9cwm\" (UniqueName: \"kubernetes.io/projected/cdfc7644-a187-4fe9-8067-fa474114c1a1-kube-api-access-b9cwm\") pod \"cdfc7644-a187-4fe9-8067-fa474114c1a1\" (UID: \"cdfc7644-a187-4fe9-8067-fa474114c1a1\") " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.767689 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdfc7644-a187-4fe9-8067-fa474114c1a1-logs" (OuterVolumeSpecName: "logs") pod "cdfc7644-a187-4fe9-8067-fa474114c1a1" (UID: 
"cdfc7644-a187-4fe9-8067-fa474114c1a1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.767853 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdfc7644-a187-4fe9-8067-fa474114c1a1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cdfc7644-a187-4fe9-8067-fa474114c1a1" (UID: "cdfc7644-a187-4fe9-8067-fa474114c1a1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.768950 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfc7644-a187-4fe9-8067-fa474114c1a1-scripts" (OuterVolumeSpecName: "scripts") pod "cdfc7644-a187-4fe9-8067-fa474114c1a1" (UID: "cdfc7644-a187-4fe9-8067-fa474114c1a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.782785 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "cdfc7644-a187-4fe9-8067-fa474114c1a1" (UID: "cdfc7644-a187-4fe9-8067-fa474114c1a1"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.783404 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdfc7644-a187-4fe9-8067-fa474114c1a1-kube-api-access-b9cwm" (OuterVolumeSpecName: "kube-api-access-b9cwm") pod "cdfc7644-a187-4fe9-8067-fa474114c1a1" (UID: "cdfc7644-a187-4fe9-8067-fa474114c1a1"). InnerVolumeSpecName "kube-api-access-b9cwm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.813643 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.7387973629999998 podStartE2EDuration="14.813625072s" podCreationTimestamp="2026-02-28 09:21:04 +0000 UTC" firstStartedPulling="2026-02-28 09:21:05.580659565 +0000 UTC m=+1057.271228902" lastFinishedPulling="2026-02-28 09:21:17.655487274 +0000 UTC m=+1069.346056611" observedRunningTime="2026-02-28 09:21:18.794094466 +0000 UTC m=+1070.484663803" watchObservedRunningTime="2026-02-28 09:21:18.813625072 +0000 UTC m=+1070.504194409" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.817418 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfc7644-a187-4fe9-8067-fa474114c1a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdfc7644-a187-4fe9-8067-fa474114c1a1" (UID: "cdfc7644-a187-4fe9-8067-fa474114c1a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.831519 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfc7644-a187-4fe9-8067-fa474114c1a1-config-data" (OuterVolumeSpecName: "config-data") pod "cdfc7644-a187-4fe9-8067-fa474114c1a1" (UID: "cdfc7644-a187-4fe9-8067-fa474114c1a1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.866598 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfc7644-a187-4fe9-8067-fa474114c1a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.866685 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfc7644-a187-4fe9-8067-fa474114c1a1-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.866719 4687 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdfc7644-a187-4fe9-8067-fa474114c1a1-logs\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.866728 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9cwm\" (UniqueName: \"kubernetes.io/projected/cdfc7644-a187-4fe9-8067-fa474114c1a1-kube-api-access-b9cwm\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.866738 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfc7644-a187-4fe9-8067-fa474114c1a1-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.866767 4687 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.866799 4687 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cdfc7644-a187-4fe9-8067-fa474114c1a1-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.888716 4687 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/cdfc7644-a187-4fe9-8067-fa474114c1a1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cdfc7644-a187-4fe9-8067-fa474114c1a1" (UID: "cdfc7644-a187-4fe9-8067-fa474114c1a1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.898751 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7b9c5b669b-xd8lz"] Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.925096 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7b9c5b669b-xd8lz"] Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.925306 4687 scope.go:117] "RemoveContainer" containerID="72f5b1d21b2565af1ff09d9cba487ca40b4971d91a32230255a8e098ffc62761" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.938272 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5d58956cb6-f8plp"] Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.947674 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5d58956cb6-f8plp"] Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.951526 4687 scope.go:117] "RemoveContainer" containerID="8c0f0bab64ff709f237761dab2e575643a7140ce428e1242ca10ffd15bd720ce" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.955679 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.962607 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.969913 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.970496 4687 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdfc7644-a187-4fe9-8067-fa474114c1a1-public-tls-certs\") 
on node \"crc\" DevicePath \"\"" Feb 28 09:21:18 crc kubenswrapper[4687]: E0228 09:21:18.970662 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0859ec96-842c-472a-be1b-f29c8f1df2d9" containerName="neutron-api" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.970682 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="0859ec96-842c-472a-be1b-f29c8f1df2d9" containerName="neutron-api" Feb 28 09:21:18 crc kubenswrapper[4687]: E0228 09:21:18.970694 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdfc7644-a187-4fe9-8067-fa474114c1a1" containerName="glance-httpd" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.970701 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdfc7644-a187-4fe9-8067-fa474114c1a1" containerName="glance-httpd" Feb 28 09:21:18 crc kubenswrapper[4687]: E0228 09:21:18.970712 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a06887c-91c5-43bb-8631-53fac29e79b6" containerName="horizon-log" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.970718 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a06887c-91c5-43bb-8631-53fac29e79b6" containerName="horizon-log" Feb 28 09:21:18 crc kubenswrapper[4687]: E0228 09:21:18.970728 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55473b1d-46e6-4ee3-953b-36013758c6e8" containerName="ceilometer-central-agent" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.970736 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="55473b1d-46e6-4ee3-953b-36013758c6e8" containerName="ceilometer-central-agent" Feb 28 09:21:18 crc kubenswrapper[4687]: E0228 09:21:18.970745 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55473b1d-46e6-4ee3-953b-36013758c6e8" containerName="proxy-httpd" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.970752 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="55473b1d-46e6-4ee3-953b-36013758c6e8" 
containerName="proxy-httpd" Feb 28 09:21:18 crc kubenswrapper[4687]: E0228 09:21:18.970764 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdfc7644-a187-4fe9-8067-fa474114c1a1" containerName="glance-log" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.970770 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdfc7644-a187-4fe9-8067-fa474114c1a1" containerName="glance-log" Feb 28 09:21:18 crc kubenswrapper[4687]: E0228 09:21:18.970785 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a06887c-91c5-43bb-8631-53fac29e79b6" containerName="horizon" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.970792 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a06887c-91c5-43bb-8631-53fac29e79b6" containerName="horizon" Feb 28 09:21:18 crc kubenswrapper[4687]: E0228 09:21:18.970801 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55473b1d-46e6-4ee3-953b-36013758c6e8" containerName="ceilometer-notification-agent" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.970809 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="55473b1d-46e6-4ee3-953b-36013758c6e8" containerName="ceilometer-notification-agent" Feb 28 09:21:18 crc kubenswrapper[4687]: E0228 09:21:18.970830 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0859ec96-842c-472a-be1b-f29c8f1df2d9" containerName="neutron-httpd" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.970836 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="0859ec96-842c-472a-be1b-f29c8f1df2d9" containerName="neutron-httpd" Feb 28 09:21:18 crc kubenswrapper[4687]: E0228 09:21:18.970844 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55473b1d-46e6-4ee3-953b-36013758c6e8" containerName="sg-core" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.970849 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="55473b1d-46e6-4ee3-953b-36013758c6e8" containerName="sg-core" Feb 
28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.971133 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="55473b1d-46e6-4ee3-953b-36013758c6e8" containerName="ceilometer-notification-agent" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.971146 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdfc7644-a187-4fe9-8067-fa474114c1a1" containerName="glance-httpd" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.971159 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="55473b1d-46e6-4ee3-953b-36013758c6e8" containerName="ceilometer-central-agent" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.971167 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdfc7644-a187-4fe9-8067-fa474114c1a1" containerName="glance-log" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.971177 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="55473b1d-46e6-4ee3-953b-36013758c6e8" containerName="sg-core" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.971186 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="55473b1d-46e6-4ee3-953b-36013758c6e8" containerName="proxy-httpd" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.971197 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a06887c-91c5-43bb-8631-53fac29e79b6" containerName="horizon" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.971207 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a06887c-91c5-43bb-8631-53fac29e79b6" containerName="horizon-log" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.971216 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="0859ec96-842c-472a-be1b-f29c8f1df2d9" containerName="neutron-httpd" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.971232 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="0859ec96-842c-472a-be1b-f29c8f1df2d9" 
containerName="neutron-api" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.972042 4687 scope.go:117] "RemoveContainer" containerID="03244fa16f2b84c19f33830379f405964557d73d6134657ac158003cd9026866" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.973399 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.973483 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.976259 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.976740 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.986083 4687 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 28 09:21:18 crc kubenswrapper[4687]: I0228 09:21:18.998627 4687 scope.go:117] "RemoveContainer" containerID="b2671320ae659644d88f9255139ef23295ecac63a870898b1adfa50fddbad460" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.030156 4687 scope.go:117] "RemoveContainer" containerID="18848427d0dbbc8a8ada0f9975ef90eeed3cc2e0c27b19992c9f3cf0afc1647c" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.073817 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7afe8326-fb09-4f40-8a96-b517fe4fad97-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " pod="openstack/ceilometer-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.073870 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/7afe8326-fb09-4f40-8a96-b517fe4fad97-config-data\") pod \"ceilometer-0\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " pod="openstack/ceilometer-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.073904 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7afe8326-fb09-4f40-8a96-b517fe4fad97-run-httpd\") pod \"ceilometer-0\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " pod="openstack/ceilometer-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.073951 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7afe8326-fb09-4f40-8a96-b517fe4fad97-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " pod="openstack/ceilometer-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.073967 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7afe8326-fb09-4f40-8a96-b517fe4fad97-log-httpd\") pod \"ceilometer-0\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " pod="openstack/ceilometer-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.073994 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7afe8326-fb09-4f40-8a96-b517fe4fad97-scripts\") pod \"ceilometer-0\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " pod="openstack/ceilometer-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.074160 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c45jw\" (UniqueName: \"kubernetes.io/projected/7afe8326-fb09-4f40-8a96-b517fe4fad97-kube-api-access-c45jw\") pod \"ceilometer-0\" (UID: 
\"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " pod="openstack/ceilometer-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.074221 4687 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.078567 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.095988 4687 scope.go:117] "RemoveContainer" containerID="b2671320ae659644d88f9255139ef23295ecac63a870898b1adfa50fddbad460" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.106295 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 28 09:21:19 crc kubenswrapper[4687]: E0228 09:21:19.106972 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2671320ae659644d88f9255139ef23295ecac63a870898b1adfa50fddbad460\": container with ID starting with b2671320ae659644d88f9255139ef23295ecac63a870898b1adfa50fddbad460 not found: ID does not exist" containerID="b2671320ae659644d88f9255139ef23295ecac63a870898b1adfa50fddbad460" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.107121 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2671320ae659644d88f9255139ef23295ecac63a870898b1adfa50fddbad460"} err="failed to get container status \"b2671320ae659644d88f9255139ef23295ecac63a870898b1adfa50fddbad460\": rpc error: code = NotFound desc = could not find container \"b2671320ae659644d88f9255139ef23295ecac63a870898b1adfa50fddbad460\": container with ID starting with b2671320ae659644d88f9255139ef23295ecac63a870898b1adfa50fddbad460 not found: ID does not exist" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.107155 4687 scope.go:117] "RemoveContainer" 
containerID="18848427d0dbbc8a8ada0f9975ef90eeed3cc2e0c27b19992c9f3cf0afc1647c" Feb 28 09:21:19 crc kubenswrapper[4687]: E0228 09:21:19.111195 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18848427d0dbbc8a8ada0f9975ef90eeed3cc2e0c27b19992c9f3cf0afc1647c\": container with ID starting with 18848427d0dbbc8a8ada0f9975ef90eeed3cc2e0c27b19992c9f3cf0afc1647c not found: ID does not exist" containerID="18848427d0dbbc8a8ada0f9975ef90eeed3cc2e0c27b19992c9f3cf0afc1647c" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.111231 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18848427d0dbbc8a8ada0f9975ef90eeed3cc2e0c27b19992c9f3cf0afc1647c"} err="failed to get container status \"18848427d0dbbc8a8ada0f9975ef90eeed3cc2e0c27b19992c9f3cf0afc1647c\": rpc error: code = NotFound desc = could not find container \"18848427d0dbbc8a8ada0f9975ef90eeed3cc2e0c27b19992c9f3cf0afc1647c\": container with ID starting with 18848427d0dbbc8a8ada0f9975ef90eeed3cc2e0c27b19992c9f3cf0afc1647c not found: ID does not exist" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.111261 4687 scope.go:117] "RemoveContainer" containerID="4656e89b07a7ea7b271db19827aae6edf0c11a8f0f6bf7fc313827ad223c9711" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.136144 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.137872 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.144772 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.144949 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.145277 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.149896 4687 scope.go:117] "RemoveContainer" containerID="7328c614e681d041cb5298aaf3c060ab486ec99e7e55c7aea3787f82a35a56a5" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.174873 4687 scope.go:117] "RemoveContainer" containerID="3c5a37cb15025e2ea9da64019e231339514fd05d502893554d2668b276e265be" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.175930 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7afe8326-fb09-4f40-8a96-b517fe4fad97-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " pod="openstack/ceilometer-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.175995 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7afe8326-fb09-4f40-8a96-b517fe4fad97-log-httpd\") pod \"ceilometer-0\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " pod="openstack/ceilometer-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.176071 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7afe8326-fb09-4f40-8a96-b517fe4fad97-scripts\") pod \"ceilometer-0\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " pod="openstack/ceilometer-0" Feb 
28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.176730 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7afe8326-fb09-4f40-8a96-b517fe4fad97-log-httpd\") pod \"ceilometer-0\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " pod="openstack/ceilometer-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.177078 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c45jw\" (UniqueName: \"kubernetes.io/projected/7afe8326-fb09-4f40-8a96-b517fe4fad97-kube-api-access-c45jw\") pod \"ceilometer-0\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " pod="openstack/ceilometer-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.177204 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7afe8326-fb09-4f40-8a96-b517fe4fad97-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " pod="openstack/ceilometer-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.177242 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7afe8326-fb09-4f40-8a96-b517fe4fad97-config-data\") pod \"ceilometer-0\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " pod="openstack/ceilometer-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.177282 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7afe8326-fb09-4f40-8a96-b517fe4fad97-run-httpd\") pod \"ceilometer-0\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " pod="openstack/ceilometer-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.177636 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7afe8326-fb09-4f40-8a96-b517fe4fad97-run-httpd\") pod \"ceilometer-0\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " pod="openstack/ceilometer-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.180628 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7afe8326-fb09-4f40-8a96-b517fe4fad97-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " pod="openstack/ceilometer-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.180919 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7afe8326-fb09-4f40-8a96-b517fe4fad97-scripts\") pod \"ceilometer-0\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " pod="openstack/ceilometer-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.186951 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7afe8326-fb09-4f40-8a96-b517fe4fad97-config-data\") pod \"ceilometer-0\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " pod="openstack/ceilometer-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.187536 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7afe8326-fb09-4f40-8a96-b517fe4fad97-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " pod="openstack/ceilometer-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.200615 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c45jw\" (UniqueName: \"kubernetes.io/projected/7afe8326-fb09-4f40-8a96-b517fe4fad97-kube-api-access-c45jw\") pod \"ceilometer-0\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " pod="openstack/ceilometer-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.222083 4687 scope.go:117] 
"RemoveContainer" containerID="4d6f76835d7dff86c5ab16f423bf593a4f1d175af0596304247ae50231fa48a5" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.279699 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df7927ff-9e46-45c4-8f30-f55742dda755-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"df7927ff-9e46-45c4-8f30-f55742dda755\") " pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.279792 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df7927ff-9e46-45c4-8f30-f55742dda755-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"df7927ff-9e46-45c4-8f30-f55742dda755\") " pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.279932 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df7927ff-9e46-45c4-8f30-f55742dda755-logs\") pod \"glance-default-external-api-0\" (UID: \"df7927ff-9e46-45c4-8f30-f55742dda755\") " pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.280147 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9vc8\" (UniqueName: \"kubernetes.io/projected/df7927ff-9e46-45c4-8f30-f55742dda755-kube-api-access-m9vc8\") pod \"glance-default-external-api-0\" (UID: \"df7927ff-9e46-45c4-8f30-f55742dda755\") " pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.280193 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/df7927ff-9e46-45c4-8f30-f55742dda755-config-data\") pod \"glance-default-external-api-0\" (UID: \"df7927ff-9e46-45c4-8f30-f55742dda755\") " pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.280277 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df7927ff-9e46-45c4-8f30-f55742dda755-scripts\") pod \"glance-default-external-api-0\" (UID: \"df7927ff-9e46-45c4-8f30-f55742dda755\") " pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.280352 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df7927ff-9e46-45c4-8f30-f55742dda755-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"df7927ff-9e46-45c4-8f30-f55742dda755\") " pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.280388 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"df7927ff-9e46-45c4-8f30-f55742dda755\") " pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.295592 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.382597 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9vc8\" (UniqueName: \"kubernetes.io/projected/df7927ff-9e46-45c4-8f30-f55742dda755-kube-api-access-m9vc8\") pod \"glance-default-external-api-0\" (UID: \"df7927ff-9e46-45c4-8f30-f55742dda755\") " pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.382887 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df7927ff-9e46-45c4-8f30-f55742dda755-config-data\") pod \"glance-default-external-api-0\" (UID: \"df7927ff-9e46-45c4-8f30-f55742dda755\") " pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.382943 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df7927ff-9e46-45c4-8f30-f55742dda755-scripts\") pod \"glance-default-external-api-0\" (UID: \"df7927ff-9e46-45c4-8f30-f55742dda755\") " pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.383005 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df7927ff-9e46-45c4-8f30-f55742dda755-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"df7927ff-9e46-45c4-8f30-f55742dda755\") " pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.383043 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"df7927ff-9e46-45c4-8f30-f55742dda755\") " pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc 
kubenswrapper[4687]: I0228 09:21:19.383117 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df7927ff-9e46-45c4-8f30-f55742dda755-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"df7927ff-9e46-45c4-8f30-f55742dda755\") " pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.383187 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df7927ff-9e46-45c4-8f30-f55742dda755-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"df7927ff-9e46-45c4-8f30-f55742dda755\") " pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.383231 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df7927ff-9e46-45c4-8f30-f55742dda755-logs\") pod \"glance-default-external-api-0\" (UID: \"df7927ff-9e46-45c4-8f30-f55742dda755\") " pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.383693 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df7927ff-9e46-45c4-8f30-f55742dda755-logs\") pod \"glance-default-external-api-0\" (UID: \"df7927ff-9e46-45c4-8f30-f55742dda755\") " pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.384722 4687 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"df7927ff-9e46-45c4-8f30-f55742dda755\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.384944 4687 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df7927ff-9e46-45c4-8f30-f55742dda755-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"df7927ff-9e46-45c4-8f30-f55742dda755\") " pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.390461 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df7927ff-9e46-45c4-8f30-f55742dda755-scripts\") pod \"glance-default-external-api-0\" (UID: \"df7927ff-9e46-45c4-8f30-f55742dda755\") " pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.391689 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df7927ff-9e46-45c4-8f30-f55742dda755-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"df7927ff-9e46-45c4-8f30-f55742dda755\") " pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.392526 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df7927ff-9e46-45c4-8f30-f55742dda755-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"df7927ff-9e46-45c4-8f30-f55742dda755\") " pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.397284 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df7927ff-9e46-45c4-8f30-f55742dda755-config-data\") pod \"glance-default-external-api-0\" (UID: \"df7927ff-9e46-45c4-8f30-f55742dda755\") " pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.404920 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9vc8\" (UniqueName: 
\"kubernetes.io/projected/df7927ff-9e46-45c4-8f30-f55742dda755-kube-api-access-m9vc8\") pod \"glance-default-external-api-0\" (UID: \"df7927ff-9e46-45c4-8f30-f55742dda755\") " pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.419656 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"df7927ff-9e46-45c4-8f30-f55742dda755\") " pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.464791 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.736504 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.760232 4687 generic.go:334] "Generic (PLEG): container finished" podID="fcaf528f-cd30-4024-b73f-da1ac741ee53" containerID="3a89bc6486a14484de7179d420bf770c0d6e6f262f92b3f6fe6bfaee21fd64a8" exitCode=0 Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.760298 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d297-account-create-update-688sh" event={"ID":"fcaf528f-cd30-4024-b73f-da1ac741ee53","Type":"ContainerDied","Data":"3a89bc6486a14484de7179d420bf770c0d6e6f262f92b3f6fe6bfaee21fd64a8"} Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.762002 4687 generic.go:334] "Generic (PLEG): container finished" podID="8523b7a8-45d6-4708-b1e7-4c3dbb505640" containerID="879a5a4f6bce04b8d0f622b16c280ec2579a68147499ef39475b3805b16f0c0d" exitCode=0 Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.762079 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a257-account-create-update-9t7cz" 
event={"ID":"8523b7a8-45d6-4708-b1e7-4c3dbb505640","Type":"ContainerDied","Data":"879a5a4f6bce04b8d0f622b16c280ec2579a68147499ef39475b3805b16f0c0d"} Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.770917 4687 generic.go:334] "Generic (PLEG): container finished" podID="9159f256-61e8-41bc-bceb-d602b568ef60" containerID="9b5fd43c0e428bad6cdca44f3ec6b58781eea24cec60383f60b95a07397f8736" exitCode=0 Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.770983 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6qwj6" event={"ID":"9159f256-61e8-41bc-bceb-d602b568ef60","Type":"ContainerDied","Data":"9b5fd43c0e428bad6cdca44f3ec6b58781eea24cec60383f60b95a07397f8736"} Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.777134 4687 generic.go:334] "Generic (PLEG): container finished" podID="dc1143c7-db81-4638-ad50-a1d7d26d9ad7" containerID="b64dea1d776a999a2a152ba3b9c5b54f53b61d67f6df9c676d2569c3c14be455" exitCode=0 Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.777216 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fzwm9" event={"ID":"dc1143c7-db81-4638-ad50-a1d7d26d9ad7","Type":"ContainerDied","Data":"b64dea1d776a999a2a152ba3b9c5b54f53b61d67f6df9c676d2569c3c14be455"} Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.782329 4687 generic.go:334] "Generic (PLEG): container finished" podID="5777779d-582f-4e60-ac7f-e194408c31eb" containerID="c24080e76861b55cba45b319deac146d73a4f68b51069d4b4ce6a2e35a9bc587" exitCode=0 Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.782388 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-86rbf" event={"ID":"5777779d-582f-4e60-ac7f-e194408c31eb","Type":"ContainerDied","Data":"c24080e76861b55cba45b319deac146d73a4f68b51069d4b4ce6a2e35a9bc587"} Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.790613 4687 generic.go:334] "Generic (PLEG): container finished" 
podID="131e7bdc-bd19-4a7e-b0ad-a561c7f3a857" containerID="15b1e63e4e21a967e9268dfa26574c0b6305a6b5de1a1d1cc14160fdab783c24" exitCode=0 Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.790921 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-68b5-account-create-update-zv5jr" event={"ID":"131e7bdc-bd19-4a7e-b0ad-a561c7f3a857","Type":"ContainerDied","Data":"15b1e63e4e21a967e9268dfa26574c0b6305a6b5de1a1d1cc14160fdab783c24"} Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.905775 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:21:19 crc kubenswrapper[4687]: I0228 09:21:19.924896 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.023660 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a257-account-create-update-9t7cz" Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.102353 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8523b7a8-45d6-4708-b1e7-4c3dbb505640-operator-scripts\") pod \"8523b7a8-45d6-4708-b1e7-4c3dbb505640\" (UID: \"8523b7a8-45d6-4708-b1e7-4c3dbb505640\") " Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.102451 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndgdn\" (UniqueName: \"kubernetes.io/projected/8523b7a8-45d6-4708-b1e7-4c3dbb505640-kube-api-access-ndgdn\") pod \"8523b7a8-45d6-4708-b1e7-4c3dbb505640\" (UID: \"8523b7a8-45d6-4708-b1e7-4c3dbb505640\") " Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.105243 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8523b7a8-45d6-4708-b1e7-4c3dbb505640-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"8523b7a8-45d6-4708-b1e7-4c3dbb505640" (UID: "8523b7a8-45d6-4708-b1e7-4c3dbb505640"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.108956 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8523b7a8-45d6-4708-b1e7-4c3dbb505640-kube-api-access-ndgdn" (OuterVolumeSpecName: "kube-api-access-ndgdn") pod "8523b7a8-45d6-4708-b1e7-4c3dbb505640" (UID: "8523b7a8-45d6-4708-b1e7-4c3dbb505640"). InnerVolumeSpecName "kube-api-access-ndgdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.205554 4687 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8523b7a8-45d6-4708-b1e7-4c3dbb505640-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.205879 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndgdn\" (UniqueName: \"kubernetes.io/projected/8523b7a8-45d6-4708-b1e7-4c3dbb505640-kube-api-access-ndgdn\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.680002 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0859ec96-842c-472a-be1b-f29c8f1df2d9" path="/var/lib/kubelet/pods/0859ec96-842c-472a-be1b-f29c8f1df2d9/volumes" Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.680876 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55473b1d-46e6-4ee3-953b-36013758c6e8" path="/var/lib/kubelet/pods/55473b1d-46e6-4ee3-953b-36013758c6e8/volumes" Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.686039 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a06887c-91c5-43bb-8631-53fac29e79b6" path="/var/lib/kubelet/pods/6a06887c-91c5-43bb-8631-53fac29e79b6/volumes" Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 
09:21:20.686669 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdfc7644-a187-4fe9-8067-fa474114c1a1" path="/var/lib/kubelet/pods/cdfc7644-a187-4fe9-8067-fa474114c1a1/volumes" Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.808586 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a257-account-create-update-9t7cz" Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.808588 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a257-account-create-update-9t7cz" event={"ID":"8523b7a8-45d6-4708-b1e7-4c3dbb505640","Type":"ContainerDied","Data":"b1ca34c0ed46a7804b81140e134de65df9df20ef150602bb71f5fd73cd4e52eb"} Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.808710 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1ca34c0ed46a7804b81140e134de65df9df20ef150602bb71f5fd73cd4e52eb" Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.810809 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7afe8326-fb09-4f40-8a96-b517fe4fad97","Type":"ContainerStarted","Data":"41164f446f3658d1e373daf489fe5d22c792e4237c49e43b9704a689ef554883"} Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.810842 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7afe8326-fb09-4f40-8a96-b517fe4fad97","Type":"ContainerStarted","Data":"a893cd90635b3c008d9736a2edb840c6d2d349b9ea16aaa003e2e927f02871d5"} Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.816780 4687 generic.go:334] "Generic (PLEG): container finished" podID="59ec19ad-b746-417b-a573-1b450746e794" containerID="8bd1539f05f84dff93650ce81fe1fb27a301643199250c07815c3f641b7b68d3" exitCode=0 Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.816822 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"59ec19ad-b746-417b-a573-1b450746e794","Type":"ContainerDied","Data":"8bd1539f05f84dff93650ce81fe1fb27a301643199250c07815c3f641b7b68d3"} Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.816867 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"59ec19ad-b746-417b-a573-1b450746e794","Type":"ContainerDied","Data":"8006049fb60c346daf716735c999f799a9d932c3d6ca58c1ef84b3b4687ca796"} Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.816881 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8006049fb60c346daf716735c999f799a9d932c3d6ca58c1ef84b3b4687ca796" Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.819595 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"df7927ff-9e46-45c4-8f30-f55742dda755","Type":"ContainerStarted","Data":"05131adc1b0579d6c792f1d6a2706b6e3824eacb7b931b1ed298ee7054738f67"} Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.819640 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"df7927ff-9e46-45c4-8f30-f55742dda755","Type":"ContainerStarted","Data":"4f5847b373e1696717ec9f849cfa041d54bb8ce892a7a29c5bdd2e10da5a8e6e"} Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.833268 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.936508 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqlpt\" (UniqueName: \"kubernetes.io/projected/59ec19ad-b746-417b-a573-1b450746e794-kube-api-access-tqlpt\") pod \"59ec19ad-b746-417b-a573-1b450746e794\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.936603 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59ec19ad-b746-417b-a573-1b450746e794-scripts\") pod \"59ec19ad-b746-417b-a573-1b450746e794\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.936691 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59ec19ad-b746-417b-a573-1b450746e794-internal-tls-certs\") pod \"59ec19ad-b746-417b-a573-1b450746e794\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.936784 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59ec19ad-b746-417b-a573-1b450746e794-httpd-run\") pod \"59ec19ad-b746-417b-a573-1b450746e794\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.936807 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"59ec19ad-b746-417b-a573-1b450746e794\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.936824 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/59ec19ad-b746-417b-a573-1b450746e794-logs\") pod \"59ec19ad-b746-417b-a573-1b450746e794\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.936843 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59ec19ad-b746-417b-a573-1b450746e794-combined-ca-bundle\") pod \"59ec19ad-b746-417b-a573-1b450746e794\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.936872 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59ec19ad-b746-417b-a573-1b450746e794-config-data\") pod \"59ec19ad-b746-417b-a573-1b450746e794\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.941857 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59ec19ad-b746-417b-a573-1b450746e794-kube-api-access-tqlpt" (OuterVolumeSpecName: "kube-api-access-tqlpt") pod "59ec19ad-b746-417b-a573-1b450746e794" (UID: "59ec19ad-b746-417b-a573-1b450746e794"). InnerVolumeSpecName "kube-api-access-tqlpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.944381 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "59ec19ad-b746-417b-a573-1b450746e794" (UID: "59ec19ad-b746-417b-a573-1b450746e794"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.945278 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59ec19ad-b746-417b-a573-1b450746e794-logs" (OuterVolumeSpecName: "logs") pod "59ec19ad-b746-417b-a573-1b450746e794" (UID: "59ec19ad-b746-417b-a573-1b450746e794"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.945769 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59ec19ad-b746-417b-a573-1b450746e794-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "59ec19ad-b746-417b-a573-1b450746e794" (UID: "59ec19ad-b746-417b-a573-1b450746e794"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.948160 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59ec19ad-b746-417b-a573-1b450746e794-scripts" (OuterVolumeSpecName: "scripts") pod "59ec19ad-b746-417b-a573-1b450746e794" (UID: "59ec19ad-b746-417b-a573-1b450746e794"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:20 crc kubenswrapper[4687]: I0228 09:21:20.970515 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59ec19ad-b746-417b-a573-1b450746e794-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59ec19ad-b746-417b-a573-1b450746e794" (UID: "59ec19ad-b746-417b-a573-1b450746e794"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.012494 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59ec19ad-b746-417b-a573-1b450746e794-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "59ec19ad-b746-417b-a573-1b450746e794" (UID: "59ec19ad-b746-417b-a573-1b450746e794"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.039133 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59ec19ad-b746-417b-a573-1b450746e794-config-data" (OuterVolumeSpecName: "config-data") pod "59ec19ad-b746-417b-a573-1b450746e794" (UID: "59ec19ad-b746-417b-a573-1b450746e794"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.041236 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59ec19ad-b746-417b-a573-1b450746e794-config-data\") pod \"59ec19ad-b746-417b-a573-1b450746e794\" (UID: \"59ec19ad-b746-417b-a573-1b450746e794\") " Feb 28 09:21:21 crc kubenswrapper[4687]: W0228 09:21:21.042038 4687 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/59ec19ad-b746-417b-a573-1b450746e794/volumes/kubernetes.io~secret/config-data Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.042057 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59ec19ad-b746-417b-a573-1b450746e794-config-data" (OuterVolumeSpecName: "config-data") pod "59ec19ad-b746-417b-a573-1b450746e794" (UID: "59ec19ad-b746-417b-a573-1b450746e794"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.042462 4687 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59ec19ad-b746-417b-a573-1b450746e794-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.042489 4687 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59ec19ad-b746-417b-a573-1b450746e794-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.042518 4687 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.042529 4687 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59ec19ad-b746-417b-a573-1b450746e794-logs\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.042538 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59ec19ad-b746-417b-a573-1b450746e794-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.042550 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59ec19ad-b746-417b-a573-1b450746e794-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.042558 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqlpt\" (UniqueName: \"kubernetes.io/projected/59ec19ad-b746-417b-a573-1b450746e794-kube-api-access-tqlpt\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.042568 4687 reconciler_common.go:293] 
"Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59ec19ad-b746-417b-a573-1b450746e794-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.068607 4687 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.144955 4687 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.199424 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-86rbf" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.245789 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg8wg\" (UniqueName: \"kubernetes.io/projected/5777779d-582f-4e60-ac7f-e194408c31eb-kube-api-access-hg8wg\") pod \"5777779d-582f-4e60-ac7f-e194408c31eb\" (UID: \"5777779d-582f-4e60-ac7f-e194408c31eb\") " Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.246096 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5777779d-582f-4e60-ac7f-e194408c31eb-operator-scripts\") pod \"5777779d-582f-4e60-ac7f-e194408c31eb\" (UID: \"5777779d-582f-4e60-ac7f-e194408c31eb\") " Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.246945 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5777779d-582f-4e60-ac7f-e194408c31eb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5777779d-582f-4e60-ac7f-e194408c31eb" (UID: "5777779d-582f-4e60-ac7f-e194408c31eb"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.256603 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5777779d-582f-4e60-ac7f-e194408c31eb-kube-api-access-hg8wg" (OuterVolumeSpecName: "kube-api-access-hg8wg") pod "5777779d-582f-4e60-ac7f-e194408c31eb" (UID: "5777779d-582f-4e60-ac7f-e194408c31eb"). InnerVolumeSpecName "kube-api-access-hg8wg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.348827 4687 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5777779d-582f-4e60-ac7f-e194408c31eb-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.349096 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg8wg\" (UniqueName: \"kubernetes.io/projected/5777779d-582f-4e60-ac7f-e194408c31eb-kube-api-access-hg8wg\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.359798 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d297-account-create-update-688sh" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.381317 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-68b5-account-create-update-zv5jr" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.407278 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6qwj6" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.436391 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-fzwm9" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.453511 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9159f256-61e8-41bc-bceb-d602b568ef60-operator-scripts\") pod \"9159f256-61e8-41bc-bceb-d602b568ef60\" (UID: \"9159f256-61e8-41bc-bceb-d602b568ef60\") " Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.453615 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcaf528f-cd30-4024-b73f-da1ac741ee53-operator-scripts\") pod \"fcaf528f-cd30-4024-b73f-da1ac741ee53\" (UID: \"fcaf528f-cd30-4024-b73f-da1ac741ee53\") " Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.453753 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/131e7bdc-bd19-4a7e-b0ad-a561c7f3a857-operator-scripts\") pod \"131e7bdc-bd19-4a7e-b0ad-a561c7f3a857\" (UID: \"131e7bdc-bd19-4a7e-b0ad-a561c7f3a857\") " Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.453852 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz69l\" (UniqueName: \"kubernetes.io/projected/131e7bdc-bd19-4a7e-b0ad-a561c7f3a857-kube-api-access-kz69l\") pod \"131e7bdc-bd19-4a7e-b0ad-a561c7f3a857\" (UID: \"131e7bdc-bd19-4a7e-b0ad-a561c7f3a857\") " Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.453875 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6kkd\" (UniqueName: \"kubernetes.io/projected/fcaf528f-cd30-4024-b73f-da1ac741ee53-kube-api-access-r6kkd\") pod \"fcaf528f-cd30-4024-b73f-da1ac741ee53\" (UID: \"fcaf528f-cd30-4024-b73f-da1ac741ee53\") " Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.453980 4687 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-pb848\" (UniqueName: \"kubernetes.io/projected/9159f256-61e8-41bc-bceb-d602b568ef60-kube-api-access-pb848\") pod \"9159f256-61e8-41bc-bceb-d602b568ef60\" (UID: \"9159f256-61e8-41bc-bceb-d602b568ef60\") " Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.455937 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9159f256-61e8-41bc-bceb-d602b568ef60-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9159f256-61e8-41bc-bceb-d602b568ef60" (UID: "9159f256-61e8-41bc-bceb-d602b568ef60"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.456578 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/131e7bdc-bd19-4a7e-b0ad-a561c7f3a857-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "131e7bdc-bd19-4a7e-b0ad-a561c7f3a857" (UID: "131e7bdc-bd19-4a7e-b0ad-a561c7f3a857"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.456830 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcaf528f-cd30-4024-b73f-da1ac741ee53-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fcaf528f-cd30-4024-b73f-da1ac741ee53" (UID: "fcaf528f-cd30-4024-b73f-da1ac741ee53"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.468493 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9159f256-61e8-41bc-bceb-d602b568ef60-kube-api-access-pb848" (OuterVolumeSpecName: "kube-api-access-pb848") pod "9159f256-61e8-41bc-bceb-d602b568ef60" (UID: "9159f256-61e8-41bc-bceb-d602b568ef60"). 
InnerVolumeSpecName "kube-api-access-pb848". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.472334 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/131e7bdc-bd19-4a7e-b0ad-a561c7f3a857-kube-api-access-kz69l" (OuterVolumeSpecName: "kube-api-access-kz69l") pod "131e7bdc-bd19-4a7e-b0ad-a561c7f3a857" (UID: "131e7bdc-bd19-4a7e-b0ad-a561c7f3a857"). InnerVolumeSpecName "kube-api-access-kz69l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.484507 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcaf528f-cd30-4024-b73f-da1ac741ee53-kube-api-access-r6kkd" (OuterVolumeSpecName: "kube-api-access-r6kkd") pod "fcaf528f-cd30-4024-b73f-da1ac741ee53" (UID: "fcaf528f-cd30-4024-b73f-da1ac741ee53"). InnerVolumeSpecName "kube-api-access-r6kkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.556568 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc1143c7-db81-4638-ad50-a1d7d26d9ad7-operator-scripts\") pod \"dc1143c7-db81-4638-ad50-a1d7d26d9ad7\" (UID: \"dc1143c7-db81-4638-ad50-a1d7d26d9ad7\") " Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.556650 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9cv5\" (UniqueName: \"kubernetes.io/projected/dc1143c7-db81-4638-ad50-a1d7d26d9ad7-kube-api-access-l9cv5\") pod \"dc1143c7-db81-4638-ad50-a1d7d26d9ad7\" (UID: \"dc1143c7-db81-4638-ad50-a1d7d26d9ad7\") " Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.557158 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz69l\" (UniqueName: 
\"kubernetes.io/projected/131e7bdc-bd19-4a7e-b0ad-a561c7f3a857-kube-api-access-kz69l\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.557176 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6kkd\" (UniqueName: \"kubernetes.io/projected/fcaf528f-cd30-4024-b73f-da1ac741ee53-kube-api-access-r6kkd\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.557167 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc1143c7-db81-4638-ad50-a1d7d26d9ad7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dc1143c7-db81-4638-ad50-a1d7d26d9ad7" (UID: "dc1143c7-db81-4638-ad50-a1d7d26d9ad7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.557187 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb848\" (UniqueName: \"kubernetes.io/projected/9159f256-61e8-41bc-bceb-d602b568ef60-kube-api-access-pb848\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.557256 4687 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9159f256-61e8-41bc-bceb-d602b568ef60-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.557270 4687 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcaf528f-cd30-4024-b73f-da1ac741ee53-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.557281 4687 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/131e7bdc-bd19-4a7e-b0ad-a561c7f3a857-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 
09:21:21.560398 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc1143c7-db81-4638-ad50-a1d7d26d9ad7-kube-api-access-l9cv5" (OuterVolumeSpecName: "kube-api-access-l9cv5") pod "dc1143c7-db81-4638-ad50-a1d7d26d9ad7" (UID: "dc1143c7-db81-4638-ad50-a1d7d26d9ad7"). InnerVolumeSpecName "kube-api-access-l9cv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.658529 4687 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc1143c7-db81-4638-ad50-a1d7d26d9ad7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.658564 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9cv5\" (UniqueName: \"kubernetes.io/projected/dc1143c7-db81-4638-ad50-a1d7d26d9ad7-kube-api-access-l9cv5\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.859360 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-6qwj6" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.859550 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6qwj6" event={"ID":"9159f256-61e8-41bc-bceb-d602b568ef60","Type":"ContainerDied","Data":"6214490f17418c95afff3c41658eef3c4691532824330be95a2a3b366498f63b"} Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.859614 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6214490f17418c95afff3c41658eef3c4691532824330be95a2a3b366498f63b" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.864899 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"df7927ff-9e46-45c4-8f30-f55742dda755","Type":"ContainerStarted","Data":"b6683654d8fb7085a1fb708b4a496662255d1bab1df68541a28927b0e0cdfa8f"} Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.871184 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fzwm9" event={"ID":"dc1143c7-db81-4638-ad50-a1d7d26d9ad7","Type":"ContainerDied","Data":"0ca3dfbcb0ad83bac58c4bda08d84f878fddd43344ebc1f8c9cd104d9757d20f"} Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.871228 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ca3dfbcb0ad83bac58c4bda08d84f878fddd43344ebc1f8c9cd104d9757d20f" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.871278 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-fzwm9" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.874528 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d297-account-create-update-688sh" event={"ID":"fcaf528f-cd30-4024-b73f-da1ac741ee53","Type":"ContainerDied","Data":"104ef930d7ec64d156803b37c0833b8445a5b4c8d75225eab2373345eddd9bee"} Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.874574 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="104ef930d7ec64d156803b37c0833b8445a5b4c8d75225eab2373345eddd9bee" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.874623 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d297-account-create-update-688sh" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.878535 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-86rbf" event={"ID":"5777779d-582f-4e60-ac7f-e194408c31eb","Type":"ContainerDied","Data":"b90ae8989f975b805d3460ef6838d4b4ce41cbd537a13eba55f7bf3e0a0f7d2a"} Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.878563 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b90ae8989f975b805d3460ef6838d4b4ce41cbd537a13eba55f7bf3e0a0f7d2a" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.878613 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-86rbf" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.880869 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-68b5-account-create-update-zv5jr" event={"ID":"131e7bdc-bd19-4a7e-b0ad-a561c7f3a857","Type":"ContainerDied","Data":"299cc218229aa755c91033efa47ef70b296085cd6279c72f9225a3312a4c2799"} Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.880915 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="299cc218229aa755c91033efa47ef70b296085cd6279c72f9225a3312a4c2799" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.881015 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-68b5-account-create-update-zv5jr" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.893858 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.895122 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7afe8326-fb09-4f40-8a96-b517fe4fad97","Type":"ContainerStarted","Data":"557c186963ce1aa9b3a21fc63d441a977277b006bd13d8de5edc7babc1cad855"} Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.898143 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=2.898127298 podStartE2EDuration="2.898127298s" podCreationTimestamp="2026-02-28 09:21:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:21:21.884487299 +0000 UTC m=+1073.575056646" watchObservedRunningTime="2026-02-28 09:21:21.898127298 +0000 UTC m=+1073.588696636" Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.966649 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Feb 28 09:21:21 crc kubenswrapper[4687]: I0228 09:21:21.976965 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:21.999792 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 28 09:21:22 crc kubenswrapper[4687]: E0228 09:21:22.000878 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcaf528f-cd30-4024-b73f-da1ac741ee53" containerName="mariadb-account-create-update" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.000916 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcaf528f-cd30-4024-b73f-da1ac741ee53" containerName="mariadb-account-create-update" Feb 28 09:21:22 crc kubenswrapper[4687]: E0228 09:21:22.000964 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="131e7bdc-bd19-4a7e-b0ad-a561c7f3a857" containerName="mariadb-account-create-update" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.000971 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="131e7bdc-bd19-4a7e-b0ad-a561c7f3a857" containerName="mariadb-account-create-update" Feb 28 09:21:22 crc kubenswrapper[4687]: E0228 09:21:22.000979 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc1143c7-db81-4638-ad50-a1d7d26d9ad7" containerName="mariadb-database-create" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.000985 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc1143c7-db81-4638-ad50-a1d7d26d9ad7" containerName="mariadb-database-create" Feb 28 09:21:22 crc kubenswrapper[4687]: E0228 09:21:22.000999 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8523b7a8-45d6-4708-b1e7-4c3dbb505640" containerName="mariadb-account-create-update" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.001005 4687 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8523b7a8-45d6-4708-b1e7-4c3dbb505640" containerName="mariadb-account-create-update" Feb 28 09:21:22 crc kubenswrapper[4687]: E0228 09:21:22.001032 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5777779d-582f-4e60-ac7f-e194408c31eb" containerName="mariadb-database-create" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.001040 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="5777779d-582f-4e60-ac7f-e194408c31eb" containerName="mariadb-database-create" Feb 28 09:21:22 crc kubenswrapper[4687]: E0228 09:21:22.001056 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59ec19ad-b746-417b-a573-1b450746e794" containerName="glance-httpd" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.001068 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="59ec19ad-b746-417b-a573-1b450746e794" containerName="glance-httpd" Feb 28 09:21:22 crc kubenswrapper[4687]: E0228 09:21:22.001086 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9159f256-61e8-41bc-bceb-d602b568ef60" containerName="mariadb-database-create" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.001091 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="9159f256-61e8-41bc-bceb-d602b568ef60" containerName="mariadb-database-create" Feb 28 09:21:22 crc kubenswrapper[4687]: E0228 09:21:22.001112 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59ec19ad-b746-417b-a573-1b450746e794" containerName="glance-log" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.001118 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="59ec19ad-b746-417b-a573-1b450746e794" containerName="glance-log" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.001547 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc1143c7-db81-4638-ad50-a1d7d26d9ad7" containerName="mariadb-database-create" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.001580 4687 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="59ec19ad-b746-417b-a573-1b450746e794" containerName="glance-httpd" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.001607 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="8523b7a8-45d6-4708-b1e7-4c3dbb505640" containerName="mariadb-account-create-update" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.001621 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcaf528f-cd30-4024-b73f-da1ac741ee53" containerName="mariadb-account-create-update" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.001631 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="9159f256-61e8-41bc-bceb-d602b568ef60" containerName="mariadb-database-create" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.001643 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="5777779d-582f-4e60-ac7f-e194408c31eb" containerName="mariadb-database-create" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.001662 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="59ec19ad-b746-417b-a573-1b450746e794" containerName="glance-log" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.001676 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="131e7bdc-bd19-4a7e-b0ad-a561c7f3a857" containerName="mariadb-account-create-update" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.019675 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.022253 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.022880 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.022986 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.077800 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ab8b6d1-f4d6-4206-94a9-14e1770f672a\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.078122 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ab8b6d1-f4d6-4206-94a9-14e1770f672a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8ab8b6d1-f4d6-4206-94a9-14e1770f672a\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.078190 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ab8b6d1-f4d6-4206-94a9-14e1770f672a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8ab8b6d1-f4d6-4206-94a9-14e1770f672a\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.078221 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8ab8b6d1-f4d6-4206-94a9-14e1770f672a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8ab8b6d1-f4d6-4206-94a9-14e1770f672a\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.078335 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqq2k\" (UniqueName: \"kubernetes.io/projected/8ab8b6d1-f4d6-4206-94a9-14e1770f672a-kube-api-access-nqq2k\") pod \"glance-default-internal-api-0\" (UID: \"8ab8b6d1-f4d6-4206-94a9-14e1770f672a\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.078782 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ab8b6d1-f4d6-4206-94a9-14e1770f672a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8ab8b6d1-f4d6-4206-94a9-14e1770f672a\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.078858 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ab8b6d1-f4d6-4206-94a9-14e1770f672a-logs\") pod \"glance-default-internal-api-0\" (UID: \"8ab8b6d1-f4d6-4206-94a9-14e1770f672a\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.078900 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ab8b6d1-f4d6-4206-94a9-14e1770f672a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8ab8b6d1-f4d6-4206-94a9-14e1770f672a\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.180698 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8ab8b6d1-f4d6-4206-94a9-14e1770f672a-logs\") pod \"glance-default-internal-api-0\" (UID: \"8ab8b6d1-f4d6-4206-94a9-14e1770f672a\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.180757 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ab8b6d1-f4d6-4206-94a9-14e1770f672a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8ab8b6d1-f4d6-4206-94a9-14e1770f672a\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.180838 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ab8b6d1-f4d6-4206-94a9-14e1770f672a\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.180885 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ab8b6d1-f4d6-4206-94a9-14e1770f672a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8ab8b6d1-f4d6-4206-94a9-14e1770f672a\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.180914 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ab8b6d1-f4d6-4206-94a9-14e1770f672a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8ab8b6d1-f4d6-4206-94a9-14e1770f672a\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.180941 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ab8b6d1-f4d6-4206-94a9-14e1770f672a-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"8ab8b6d1-f4d6-4206-94a9-14e1770f672a\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.180967 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqq2k\" (UniqueName: \"kubernetes.io/projected/8ab8b6d1-f4d6-4206-94a9-14e1770f672a-kube-api-access-nqq2k\") pod \"glance-default-internal-api-0\" (UID: \"8ab8b6d1-f4d6-4206-94a9-14e1770f672a\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.181152 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ab8b6d1-f4d6-4206-94a9-14e1770f672a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8ab8b6d1-f4d6-4206-94a9-14e1770f672a\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.182452 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ab8b6d1-f4d6-4206-94a9-14e1770f672a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8ab8b6d1-f4d6-4206-94a9-14e1770f672a\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.182601 4687 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ab8b6d1-f4d6-4206-94a9-14e1770f672a\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.182752 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ab8b6d1-f4d6-4206-94a9-14e1770f672a-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"8ab8b6d1-f4d6-4206-94a9-14e1770f672a\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.188221 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ab8b6d1-f4d6-4206-94a9-14e1770f672a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8ab8b6d1-f4d6-4206-94a9-14e1770f672a\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.189338 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ab8b6d1-f4d6-4206-94a9-14e1770f672a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8ab8b6d1-f4d6-4206-94a9-14e1770f672a\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.191058 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ab8b6d1-f4d6-4206-94a9-14e1770f672a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8ab8b6d1-f4d6-4206-94a9-14e1770f672a\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.191723 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ab8b6d1-f4d6-4206-94a9-14e1770f672a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8ab8b6d1-f4d6-4206-94a9-14e1770f672a\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.202763 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqq2k\" (UniqueName: \"kubernetes.io/projected/8ab8b6d1-f4d6-4206-94a9-14e1770f672a-kube-api-access-nqq2k\") pod \"glance-default-internal-api-0\" (UID: \"8ab8b6d1-f4d6-4206-94a9-14e1770f672a\") " 
pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.224170 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ab8b6d1-f4d6-4206-94a9-14e1770f672a\") " pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.350384 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.673413 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59ec19ad-b746-417b-a573-1b450746e794" path="/var/lib/kubelet/pods/59ec19ad-b746-417b-a573-1b450746e794/volumes" Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.899902 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 28 09:21:22 crc kubenswrapper[4687]: I0228 09:21:22.902942 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7afe8326-fb09-4f40-8a96-b517fe4fad97","Type":"ContainerStarted","Data":"703ce013ab94726b6293f00b8ed37a97ab49556f6b71d75d1f033a92e805677a"} Feb 28 09:21:23 crc kubenswrapper[4687]: I0228 09:21:23.912995 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ab8b6d1-f4d6-4206-94a9-14e1770f672a","Type":"ContainerStarted","Data":"5bf3dcf262f58d7a713be8d01c99305ca7ba7a2b3b4d5b2e7146ff55e61b013c"} Feb 28 09:21:23 crc kubenswrapper[4687]: I0228 09:21:23.913504 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ab8b6d1-f4d6-4206-94a9-14e1770f672a","Type":"ContainerStarted","Data":"915eadb2585bce83439dae861c5d64c4b19d77495e9eebaf131bfb8c4b1cd9cb"} Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.533854 
4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dm25t"] Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.535358 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-dm25t" Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.538894 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-jb8lr" Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.539071 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.539274 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.558884 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dm25t"] Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.645604 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q84tl\" (UniqueName: \"kubernetes.io/projected/1c40a499-8f9a-4d0e-b266-4a5defbb7e22-kube-api-access-q84tl\") pod \"nova-cell0-conductor-db-sync-dm25t\" (UID: \"1c40a499-8f9a-4d0e-b266-4a5defbb7e22\") " pod="openstack/nova-cell0-conductor-db-sync-dm25t" Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.645645 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c40a499-8f9a-4d0e-b266-4a5defbb7e22-scripts\") pod \"nova-cell0-conductor-db-sync-dm25t\" (UID: \"1c40a499-8f9a-4d0e-b266-4a5defbb7e22\") " pod="openstack/nova-cell0-conductor-db-sync-dm25t" Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.646034 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c40a499-8f9a-4d0e-b266-4a5defbb7e22-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-dm25t\" (UID: \"1c40a499-8f9a-4d0e-b266-4a5defbb7e22\") " pod="openstack/nova-cell0-conductor-db-sync-dm25t" Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.646164 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c40a499-8f9a-4d0e-b266-4a5defbb7e22-config-data\") pod \"nova-cell0-conductor-db-sync-dm25t\" (UID: \"1c40a499-8f9a-4d0e-b266-4a5defbb7e22\") " pod="openstack/nova-cell0-conductor-db-sync-dm25t" Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.749274 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c40a499-8f9a-4d0e-b266-4a5defbb7e22-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-dm25t\" (UID: \"1c40a499-8f9a-4d0e-b266-4a5defbb7e22\") " pod="openstack/nova-cell0-conductor-db-sync-dm25t" Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.749408 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c40a499-8f9a-4d0e-b266-4a5defbb7e22-config-data\") pod \"nova-cell0-conductor-db-sync-dm25t\" (UID: \"1c40a499-8f9a-4d0e-b266-4a5defbb7e22\") " pod="openstack/nova-cell0-conductor-db-sync-dm25t" Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.749587 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q84tl\" (UniqueName: \"kubernetes.io/projected/1c40a499-8f9a-4d0e-b266-4a5defbb7e22-kube-api-access-q84tl\") pod \"nova-cell0-conductor-db-sync-dm25t\" (UID: \"1c40a499-8f9a-4d0e-b266-4a5defbb7e22\") " pod="openstack/nova-cell0-conductor-db-sync-dm25t" Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.749632 4687 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c40a499-8f9a-4d0e-b266-4a5defbb7e22-scripts\") pod \"nova-cell0-conductor-db-sync-dm25t\" (UID: \"1c40a499-8f9a-4d0e-b266-4a5defbb7e22\") " pod="openstack/nova-cell0-conductor-db-sync-dm25t" Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.756123 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c40a499-8f9a-4d0e-b266-4a5defbb7e22-config-data\") pod \"nova-cell0-conductor-db-sync-dm25t\" (UID: \"1c40a499-8f9a-4d0e-b266-4a5defbb7e22\") " pod="openstack/nova-cell0-conductor-db-sync-dm25t" Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.757588 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c40a499-8f9a-4d0e-b266-4a5defbb7e22-scripts\") pod \"nova-cell0-conductor-db-sync-dm25t\" (UID: \"1c40a499-8f9a-4d0e-b266-4a5defbb7e22\") " pod="openstack/nova-cell0-conductor-db-sync-dm25t" Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.757675 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c40a499-8f9a-4d0e-b266-4a5defbb7e22-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-dm25t\" (UID: \"1c40a499-8f9a-4d0e-b266-4a5defbb7e22\") " pod="openstack/nova-cell0-conductor-db-sync-dm25t" Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.768561 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q84tl\" (UniqueName: \"kubernetes.io/projected/1c40a499-8f9a-4d0e-b266-4a5defbb7e22-kube-api-access-q84tl\") pod \"nova-cell0-conductor-db-sync-dm25t\" (UID: \"1c40a499-8f9a-4d0e-b266-4a5defbb7e22\") " pod="openstack/nova-cell0-conductor-db-sync-dm25t" Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.850327 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-dm25t" Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.944999 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ab8b6d1-f4d6-4206-94a9-14e1770f672a","Type":"ContainerStarted","Data":"c6732fb824f423b971ad5bf0abcc13da2ebb71fa73c2ab2641433b09a8422ae4"} Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.984790 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7afe8326-fb09-4f40-8a96-b517fe4fad97","Type":"ContainerStarted","Data":"2cdd1f1ebbb5cdbc166cc59f88c4a81dcfc379b100343bed44e5d6d58e3b9635"} Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.984980 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7afe8326-fb09-4f40-8a96-b517fe4fad97" containerName="ceilometer-central-agent" containerID="cri-o://41164f446f3658d1e373daf489fe5d22c792e4237c49e43b9704a689ef554883" gracePeriod=30 Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.985147 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.985342 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7afe8326-fb09-4f40-8a96-b517fe4fad97" containerName="proxy-httpd" containerID="cri-o://2cdd1f1ebbb5cdbc166cc59f88c4a81dcfc379b100343bed44e5d6d58e3b9635" gracePeriod=30 Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.985539 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7afe8326-fb09-4f40-8a96-b517fe4fad97" containerName="sg-core" containerID="cri-o://703ce013ab94726b6293f00b8ed37a97ab49556f6b71d75d1f033a92e805677a" gracePeriod=30 Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.985543 4687 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="7afe8326-fb09-4f40-8a96-b517fe4fad97" containerName="ceilometer-notification-agent" containerID="cri-o://557c186963ce1aa9b3a21fc63d441a977277b006bd13d8de5edc7babc1cad855" gracePeriod=30 Feb 28 09:21:24 crc kubenswrapper[4687]: I0228 09:21:24.989428 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.989411296 podStartE2EDuration="3.989411296s" podCreationTimestamp="2026-02-28 09:21:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:21:24.969457927 +0000 UTC m=+1076.660027263" watchObservedRunningTime="2026-02-28 09:21:24.989411296 +0000 UTC m=+1076.679980633" Feb 28 09:21:25 crc kubenswrapper[4687]: I0228 09:21:25.028892 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.941327907 podStartE2EDuration="7.02886764s" podCreationTimestamp="2026-02-28 09:21:18 +0000 UTC" firstStartedPulling="2026-02-28 09:21:19.753333556 +0000 UTC m=+1071.443902893" lastFinishedPulling="2026-02-28 09:21:23.840873289 +0000 UTC m=+1075.531442626" observedRunningTime="2026-02-28 09:21:25.01283404 +0000 UTC m=+1076.703403378" watchObservedRunningTime="2026-02-28 09:21:25.02886764 +0000 UTC m=+1076.719436977" Feb 28 09:21:25 crc kubenswrapper[4687]: I0228 09:21:25.296993 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dm25t"] Feb 28 09:21:25 crc kubenswrapper[4687]: W0228 09:21:25.299834 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c40a499_8f9a_4d0e_b266_4a5defbb7e22.slice/crio-18e7b8d506d533e25b79b2fb155f4d5104a8816ca6f614e1ae9f830ca0cd03d5 WatchSource:0}: Error finding container 
18e7b8d506d533e25b79b2fb155f4d5104a8816ca6f614e1ae9f830ca0cd03d5: Status 404 returned error can't find the container with id 18e7b8d506d533e25b79b2fb155f4d5104a8816ca6f614e1ae9f830ca0cd03d5 Feb 28 09:21:25 crc kubenswrapper[4687]: I0228 09:21:25.999312 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-dm25t" event={"ID":"1c40a499-8f9a-4d0e-b266-4a5defbb7e22","Type":"ContainerStarted","Data":"18e7b8d506d533e25b79b2fb155f4d5104a8816ca6f614e1ae9f830ca0cd03d5"} Feb 28 09:21:26 crc kubenswrapper[4687]: I0228 09:21:26.003052 4687 generic.go:334] "Generic (PLEG): container finished" podID="7afe8326-fb09-4f40-8a96-b517fe4fad97" containerID="2cdd1f1ebbb5cdbc166cc59f88c4a81dcfc379b100343bed44e5d6d58e3b9635" exitCode=0 Feb 28 09:21:26 crc kubenswrapper[4687]: I0228 09:21:26.003106 4687 generic.go:334] "Generic (PLEG): container finished" podID="7afe8326-fb09-4f40-8a96-b517fe4fad97" containerID="703ce013ab94726b6293f00b8ed37a97ab49556f6b71d75d1f033a92e805677a" exitCode=2 Feb 28 09:21:26 crc kubenswrapper[4687]: I0228 09:21:26.003120 4687 generic.go:334] "Generic (PLEG): container finished" podID="7afe8326-fb09-4f40-8a96-b517fe4fad97" containerID="557c186963ce1aa9b3a21fc63d441a977277b006bd13d8de5edc7babc1cad855" exitCode=0 Feb 28 09:21:26 crc kubenswrapper[4687]: I0228 09:21:26.003142 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7afe8326-fb09-4f40-8a96-b517fe4fad97","Type":"ContainerDied","Data":"2cdd1f1ebbb5cdbc166cc59f88c4a81dcfc379b100343bed44e5d6d58e3b9635"} Feb 28 09:21:26 crc kubenswrapper[4687]: I0228 09:21:26.003207 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7afe8326-fb09-4f40-8a96-b517fe4fad97","Type":"ContainerDied","Data":"703ce013ab94726b6293f00b8ed37a97ab49556f6b71d75d1f033a92e805677a"} Feb 28 09:21:26 crc kubenswrapper[4687]: I0228 09:21:26.003226 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"7afe8326-fb09-4f40-8a96-b517fe4fad97","Type":"ContainerDied","Data":"557c186963ce1aa9b3a21fc63d441a977277b006bd13d8de5edc7babc1cad855"} Feb 28 09:21:27 crc kubenswrapper[4687]: I0228 09:21:27.666161 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:21:27 crc kubenswrapper[4687]: I0228 09:21:27.709422 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7afe8326-fb09-4f40-8a96-b517fe4fad97-log-httpd\") pod \"7afe8326-fb09-4f40-8a96-b517fe4fad97\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " Feb 28 09:21:27 crc kubenswrapper[4687]: I0228 09:21:27.709732 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7afe8326-fb09-4f40-8a96-b517fe4fad97-scripts\") pod \"7afe8326-fb09-4f40-8a96-b517fe4fad97\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " Feb 28 09:21:27 crc kubenswrapper[4687]: I0228 09:21:27.709754 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7afe8326-fb09-4f40-8a96-b517fe4fad97-sg-core-conf-yaml\") pod \"7afe8326-fb09-4f40-8a96-b517fe4fad97\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " Feb 28 09:21:27 crc kubenswrapper[4687]: I0228 09:21:27.709795 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7afe8326-fb09-4f40-8a96-b517fe4fad97-config-data\") pod \"7afe8326-fb09-4f40-8a96-b517fe4fad97\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " Feb 28 09:21:27 crc kubenswrapper[4687]: I0228 09:21:27.709913 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7afe8326-fb09-4f40-8a96-b517fe4fad97-run-httpd\") pod 
\"7afe8326-fb09-4f40-8a96-b517fe4fad97\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " Feb 28 09:21:27 crc kubenswrapper[4687]: I0228 09:21:27.709999 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c45jw\" (UniqueName: \"kubernetes.io/projected/7afe8326-fb09-4f40-8a96-b517fe4fad97-kube-api-access-c45jw\") pod \"7afe8326-fb09-4f40-8a96-b517fe4fad97\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " Feb 28 09:21:27 crc kubenswrapper[4687]: I0228 09:21:27.710089 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7afe8326-fb09-4f40-8a96-b517fe4fad97-combined-ca-bundle\") pod \"7afe8326-fb09-4f40-8a96-b517fe4fad97\" (UID: \"7afe8326-fb09-4f40-8a96-b517fe4fad97\") " Feb 28 09:21:27 crc kubenswrapper[4687]: I0228 09:21:27.710127 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7afe8326-fb09-4f40-8a96-b517fe4fad97-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7afe8326-fb09-4f40-8a96-b517fe4fad97" (UID: "7afe8326-fb09-4f40-8a96-b517fe4fad97"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:21:27 crc kubenswrapper[4687]: I0228 09:21:27.710425 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7afe8326-fb09-4f40-8a96-b517fe4fad97-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7afe8326-fb09-4f40-8a96-b517fe4fad97" (UID: "7afe8326-fb09-4f40-8a96-b517fe4fad97"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:21:27 crc kubenswrapper[4687]: I0228 09:21:27.710728 4687 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7afe8326-fb09-4f40-8a96-b517fe4fad97-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:27 crc kubenswrapper[4687]: I0228 09:21:27.710749 4687 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7afe8326-fb09-4f40-8a96-b517fe4fad97-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:27 crc kubenswrapper[4687]: I0228 09:21:27.718206 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7afe8326-fb09-4f40-8a96-b517fe4fad97-kube-api-access-c45jw" (OuterVolumeSpecName: "kube-api-access-c45jw") pod "7afe8326-fb09-4f40-8a96-b517fe4fad97" (UID: "7afe8326-fb09-4f40-8a96-b517fe4fad97"). InnerVolumeSpecName "kube-api-access-c45jw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:21:27 crc kubenswrapper[4687]: I0228 09:21:27.718934 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7afe8326-fb09-4f40-8a96-b517fe4fad97-scripts" (OuterVolumeSpecName: "scripts") pod "7afe8326-fb09-4f40-8a96-b517fe4fad97" (UID: "7afe8326-fb09-4f40-8a96-b517fe4fad97"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:27 crc kubenswrapper[4687]: I0228 09:21:27.748191 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7afe8326-fb09-4f40-8a96-b517fe4fad97-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7afe8326-fb09-4f40-8a96-b517fe4fad97" (UID: "7afe8326-fb09-4f40-8a96-b517fe4fad97"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:27 crc kubenswrapper[4687]: I0228 09:21:27.786931 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7afe8326-fb09-4f40-8a96-b517fe4fad97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7afe8326-fb09-4f40-8a96-b517fe4fad97" (UID: "7afe8326-fb09-4f40-8a96-b517fe4fad97"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:27 crc kubenswrapper[4687]: I0228 09:21:27.804323 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7afe8326-fb09-4f40-8a96-b517fe4fad97-config-data" (OuterVolumeSpecName: "config-data") pod "7afe8326-fb09-4f40-8a96-b517fe4fad97" (UID: "7afe8326-fb09-4f40-8a96-b517fe4fad97"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:27 crc kubenswrapper[4687]: I0228 09:21:27.813145 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7afe8326-fb09-4f40-8a96-b517fe4fad97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:27 crc kubenswrapper[4687]: I0228 09:21:27.813178 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7afe8326-fb09-4f40-8a96-b517fe4fad97-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:27 crc kubenswrapper[4687]: I0228 09:21:27.813191 4687 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7afe8326-fb09-4f40-8a96-b517fe4fad97-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:27 crc kubenswrapper[4687]: I0228 09:21:27.813201 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7afe8326-fb09-4f40-8a96-b517fe4fad97-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:27 
crc kubenswrapper[4687]: I0228 09:21:27.813210 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c45jw\" (UniqueName: \"kubernetes.io/projected/7afe8326-fb09-4f40-8a96-b517fe4fad97-kube-api-access-c45jw\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.030876 4687 generic.go:334] "Generic (PLEG): container finished" podID="7afe8326-fb09-4f40-8a96-b517fe4fad97" containerID="41164f446f3658d1e373daf489fe5d22c792e4237c49e43b9704a689ef554883" exitCode=0 Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.030924 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7afe8326-fb09-4f40-8a96-b517fe4fad97","Type":"ContainerDied","Data":"41164f446f3658d1e373daf489fe5d22c792e4237c49e43b9704a689ef554883"} Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.030946 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.030964 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7afe8326-fb09-4f40-8a96-b517fe4fad97","Type":"ContainerDied","Data":"a893cd90635b3c008d9736a2edb840c6d2d349b9ea16aaa003e2e927f02871d5"} Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.030988 4687 scope.go:117] "RemoveContainer" containerID="2cdd1f1ebbb5cdbc166cc59f88c4a81dcfc379b100343bed44e5d6d58e3b9635" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.069331 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.079712 4687 scope.go:117] "RemoveContainer" containerID="703ce013ab94726b6293f00b8ed37a97ab49556f6b71d75d1f033a92e805677a" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.084739 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 
09:21:28.098139 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:21:28 crc kubenswrapper[4687]: E0228 09:21:28.098510 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7afe8326-fb09-4f40-8a96-b517fe4fad97" containerName="proxy-httpd" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.098530 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="7afe8326-fb09-4f40-8a96-b517fe4fad97" containerName="proxy-httpd" Feb 28 09:21:28 crc kubenswrapper[4687]: E0228 09:21:28.098548 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7afe8326-fb09-4f40-8a96-b517fe4fad97" containerName="ceilometer-central-agent" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.098555 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="7afe8326-fb09-4f40-8a96-b517fe4fad97" containerName="ceilometer-central-agent" Feb 28 09:21:28 crc kubenswrapper[4687]: E0228 09:21:28.098564 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7afe8326-fb09-4f40-8a96-b517fe4fad97" containerName="sg-core" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.098570 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="7afe8326-fb09-4f40-8a96-b517fe4fad97" containerName="sg-core" Feb 28 09:21:28 crc kubenswrapper[4687]: E0228 09:21:28.098583 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7afe8326-fb09-4f40-8a96-b517fe4fad97" containerName="ceilometer-notification-agent" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.098589 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="7afe8326-fb09-4f40-8a96-b517fe4fad97" containerName="ceilometer-notification-agent" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.098743 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="7afe8326-fb09-4f40-8a96-b517fe4fad97" containerName="proxy-httpd" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.098759 4687 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="7afe8326-fb09-4f40-8a96-b517fe4fad97" containerName="ceilometer-central-agent" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.098771 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="7afe8326-fb09-4f40-8a96-b517fe4fad97" containerName="ceilometer-notification-agent" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.098785 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="7afe8326-fb09-4f40-8a96-b517fe4fad97" containerName="sg-core" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.100268 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.102773 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.103281 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.104276 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.106430 4687 scope.go:117] "RemoveContainer" containerID="557c186963ce1aa9b3a21fc63d441a977277b006bd13d8de5edc7babc1cad855" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.121722 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecee8514-9d15-4da0-ad62-416f6e9dc585-scripts\") pod \"ceilometer-0\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " pod="openstack/ceilometer-0" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.121778 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecee8514-9d15-4da0-ad62-416f6e9dc585-config-data\") pod 
\"ceilometer-0\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " pod="openstack/ceilometer-0" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.121815 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecee8514-9d15-4da0-ad62-416f6e9dc585-log-httpd\") pod \"ceilometer-0\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " pod="openstack/ceilometer-0" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.121946 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ecee8514-9d15-4da0-ad62-416f6e9dc585-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " pod="openstack/ceilometer-0" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.121978 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5klvf\" (UniqueName: \"kubernetes.io/projected/ecee8514-9d15-4da0-ad62-416f6e9dc585-kube-api-access-5klvf\") pod \"ceilometer-0\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " pod="openstack/ceilometer-0" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.122173 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecee8514-9d15-4da0-ad62-416f6e9dc585-run-httpd\") pod \"ceilometer-0\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " pod="openstack/ceilometer-0" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.122288 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecee8514-9d15-4da0-ad62-416f6e9dc585-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " pod="openstack/ceilometer-0" Feb 28 09:21:28 crc 
kubenswrapper[4687]: I0228 09:21:28.131809 4687 scope.go:117] "RemoveContainer" containerID="41164f446f3658d1e373daf489fe5d22c792e4237c49e43b9704a689ef554883" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.223960 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecee8514-9d15-4da0-ad62-416f6e9dc585-run-httpd\") pod \"ceilometer-0\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " pod="openstack/ceilometer-0" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.224128 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecee8514-9d15-4da0-ad62-416f6e9dc585-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " pod="openstack/ceilometer-0" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.224237 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecee8514-9d15-4da0-ad62-416f6e9dc585-scripts\") pod \"ceilometer-0\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " pod="openstack/ceilometer-0" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.224290 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecee8514-9d15-4da0-ad62-416f6e9dc585-config-data\") pod \"ceilometer-0\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " pod="openstack/ceilometer-0" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.224317 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecee8514-9d15-4da0-ad62-416f6e9dc585-log-httpd\") pod \"ceilometer-0\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " pod="openstack/ceilometer-0" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.224397 4687 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ecee8514-9d15-4da0-ad62-416f6e9dc585-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " pod="openstack/ceilometer-0" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.224421 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5klvf\" (UniqueName: \"kubernetes.io/projected/ecee8514-9d15-4da0-ad62-416f6e9dc585-kube-api-access-5klvf\") pod \"ceilometer-0\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " pod="openstack/ceilometer-0" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.225202 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecee8514-9d15-4da0-ad62-416f6e9dc585-run-httpd\") pod \"ceilometer-0\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " pod="openstack/ceilometer-0" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.225199 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecee8514-9d15-4da0-ad62-416f6e9dc585-log-httpd\") pod \"ceilometer-0\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " pod="openstack/ceilometer-0" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.230657 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ecee8514-9d15-4da0-ad62-416f6e9dc585-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " pod="openstack/ceilometer-0" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.233426 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecee8514-9d15-4da0-ad62-416f6e9dc585-config-data\") pod \"ceilometer-0\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " 
pod="openstack/ceilometer-0" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.234371 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecee8514-9d15-4da0-ad62-416f6e9dc585-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " pod="openstack/ceilometer-0" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.239332 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecee8514-9d15-4da0-ad62-416f6e9dc585-scripts\") pod \"ceilometer-0\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " pod="openstack/ceilometer-0" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.242482 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5klvf\" (UniqueName: \"kubernetes.io/projected/ecee8514-9d15-4da0-ad62-416f6e9dc585-kube-api-access-5klvf\") pod \"ceilometer-0\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " pod="openstack/ceilometer-0" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.419568 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:21:28 crc kubenswrapper[4687]: I0228 09:21:28.669887 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7afe8326-fb09-4f40-8a96-b517fe4fad97" path="/var/lib/kubelet/pods/7afe8326-fb09-4f40-8a96-b517fe4fad97/volumes" Feb 28 09:21:29 crc kubenswrapper[4687]: I0228 09:21:29.465206 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 28 09:21:29 crc kubenswrapper[4687]: I0228 09:21:29.465518 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 28 09:21:29 crc kubenswrapper[4687]: I0228 09:21:29.501145 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 28 09:21:29 crc kubenswrapper[4687]: I0228 09:21:29.501310 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 28 09:21:29 crc kubenswrapper[4687]: I0228 09:21:29.569541 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:21:30 crc kubenswrapper[4687]: I0228 09:21:30.054973 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 28 09:21:30 crc kubenswrapper[4687]: I0228 09:21:30.055048 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 28 09:21:31 crc kubenswrapper[4687]: I0228 09:21:31.758502 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 28 09:21:31 crc kubenswrapper[4687]: I0228 09:21:31.759378 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 28 09:21:32 crc kubenswrapper[4687]: I0228 09:21:32.351444 4687 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 28 09:21:32 crc kubenswrapper[4687]: I0228 09:21:32.351842 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 28 09:21:32 crc kubenswrapper[4687]: I0228 09:21:32.383291 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 28 09:21:32 crc kubenswrapper[4687]: I0228 09:21:32.387636 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 28 09:21:32 crc kubenswrapper[4687]: I0228 09:21:32.390989 4687 scope.go:117] "RemoveContainer" containerID="2cdd1f1ebbb5cdbc166cc59f88c4a81dcfc379b100343bed44e5d6d58e3b9635" Feb 28 09:21:32 crc kubenswrapper[4687]: E0228 09:21:32.391526 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cdd1f1ebbb5cdbc166cc59f88c4a81dcfc379b100343bed44e5d6d58e3b9635\": container with ID starting with 2cdd1f1ebbb5cdbc166cc59f88c4a81dcfc379b100343bed44e5d6d58e3b9635 not found: ID does not exist" containerID="2cdd1f1ebbb5cdbc166cc59f88c4a81dcfc379b100343bed44e5d6d58e3b9635" Feb 28 09:21:32 crc kubenswrapper[4687]: I0228 09:21:32.391567 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cdd1f1ebbb5cdbc166cc59f88c4a81dcfc379b100343bed44e5d6d58e3b9635"} err="failed to get container status \"2cdd1f1ebbb5cdbc166cc59f88c4a81dcfc379b100343bed44e5d6d58e3b9635\": rpc error: code = NotFound desc = could not find container \"2cdd1f1ebbb5cdbc166cc59f88c4a81dcfc379b100343bed44e5d6d58e3b9635\": container with ID starting with 2cdd1f1ebbb5cdbc166cc59f88c4a81dcfc379b100343bed44e5d6d58e3b9635 not found: ID does not exist" Feb 28 09:21:32 crc kubenswrapper[4687]: I0228 09:21:32.391597 4687 scope.go:117] "RemoveContainer" 
containerID="703ce013ab94726b6293f00b8ed37a97ab49556f6b71d75d1f033a92e805677a" Feb 28 09:21:32 crc kubenswrapper[4687]: E0228 09:21:32.391912 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"703ce013ab94726b6293f00b8ed37a97ab49556f6b71d75d1f033a92e805677a\": container with ID starting with 703ce013ab94726b6293f00b8ed37a97ab49556f6b71d75d1f033a92e805677a not found: ID does not exist" containerID="703ce013ab94726b6293f00b8ed37a97ab49556f6b71d75d1f033a92e805677a" Feb 28 09:21:32 crc kubenswrapper[4687]: I0228 09:21:32.391950 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"703ce013ab94726b6293f00b8ed37a97ab49556f6b71d75d1f033a92e805677a"} err="failed to get container status \"703ce013ab94726b6293f00b8ed37a97ab49556f6b71d75d1f033a92e805677a\": rpc error: code = NotFound desc = could not find container \"703ce013ab94726b6293f00b8ed37a97ab49556f6b71d75d1f033a92e805677a\": container with ID starting with 703ce013ab94726b6293f00b8ed37a97ab49556f6b71d75d1f033a92e805677a not found: ID does not exist" Feb 28 09:21:32 crc kubenswrapper[4687]: I0228 09:21:32.391979 4687 scope.go:117] "RemoveContainer" containerID="557c186963ce1aa9b3a21fc63d441a977277b006bd13d8de5edc7babc1cad855" Feb 28 09:21:32 crc kubenswrapper[4687]: E0228 09:21:32.392622 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"557c186963ce1aa9b3a21fc63d441a977277b006bd13d8de5edc7babc1cad855\": container with ID starting with 557c186963ce1aa9b3a21fc63d441a977277b006bd13d8de5edc7babc1cad855 not found: ID does not exist" containerID="557c186963ce1aa9b3a21fc63d441a977277b006bd13d8de5edc7babc1cad855" Feb 28 09:21:32 crc kubenswrapper[4687]: I0228 09:21:32.392648 4687 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"557c186963ce1aa9b3a21fc63d441a977277b006bd13d8de5edc7babc1cad855"} err="failed to get container status \"557c186963ce1aa9b3a21fc63d441a977277b006bd13d8de5edc7babc1cad855\": rpc error: code = NotFound desc = could not find container \"557c186963ce1aa9b3a21fc63d441a977277b006bd13d8de5edc7babc1cad855\": container with ID starting with 557c186963ce1aa9b3a21fc63d441a977277b006bd13d8de5edc7babc1cad855 not found: ID does not exist" Feb 28 09:21:32 crc kubenswrapper[4687]: I0228 09:21:32.392662 4687 scope.go:117] "RemoveContainer" containerID="41164f446f3658d1e373daf489fe5d22c792e4237c49e43b9704a689ef554883" Feb 28 09:21:32 crc kubenswrapper[4687]: E0228 09:21:32.393007 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41164f446f3658d1e373daf489fe5d22c792e4237c49e43b9704a689ef554883\": container with ID starting with 41164f446f3658d1e373daf489fe5d22c792e4237c49e43b9704a689ef554883 not found: ID does not exist" containerID="41164f446f3658d1e373daf489fe5d22c792e4237c49e43b9704a689ef554883" Feb 28 09:21:32 crc kubenswrapper[4687]: I0228 09:21:32.393044 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41164f446f3658d1e373daf489fe5d22c792e4237c49e43b9704a689ef554883"} err="failed to get container status \"41164f446f3658d1e373daf489fe5d22c792e4237c49e43b9704a689ef554883\": rpc error: code = NotFound desc = could not find container \"41164f446f3658d1e373daf489fe5d22c792e4237c49e43b9704a689ef554883\": container with ID starting with 41164f446f3658d1e373daf489fe5d22c792e4237c49e43b9704a689ef554883 not found: ID does not exist" Feb 28 09:21:32 crc kubenswrapper[4687]: I0228 09:21:32.828762 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:21:33 crc kubenswrapper[4687]: I0228 09:21:33.086209 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-db-sync-dm25t" event={"ID":"1c40a499-8f9a-4d0e-b266-4a5defbb7e22","Type":"ContainerStarted","Data":"0578526f9046619d023056e30230dbbd0cb87d8a6b2896535f311cb8c67a00c8"} Feb 28 09:21:33 crc kubenswrapper[4687]: I0228 09:21:33.088826 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecee8514-9d15-4da0-ad62-416f6e9dc585","Type":"ContainerStarted","Data":"08fdfd2f4498aaefdb9bc7115d72837979f1aebf91b0d37c4d531b14ac8e703a"} Feb 28 09:21:33 crc kubenswrapper[4687]: I0228 09:21:33.089267 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 28 09:21:33 crc kubenswrapper[4687]: I0228 09:21:33.089303 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 28 09:21:33 crc kubenswrapper[4687]: I0228 09:21:33.107955 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-dm25t" podStartSLOduration=1.971203013 podStartE2EDuration="9.107932903s" podCreationTimestamp="2026-02-28 09:21:24 +0000 UTC" firstStartedPulling="2026-02-28 09:21:25.302514003 +0000 UTC m=+1076.993083340" lastFinishedPulling="2026-02-28 09:21:32.439243892 +0000 UTC m=+1084.129813230" observedRunningTime="2026-02-28 09:21:33.101455512 +0000 UTC m=+1084.792024849" watchObservedRunningTime="2026-02-28 09:21:33.107932903 +0000 UTC m=+1084.798502240" Feb 28 09:21:34 crc kubenswrapper[4687]: I0228 09:21:34.097143 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecee8514-9d15-4da0-ad62-416f6e9dc585","Type":"ContainerStarted","Data":"f490c97eafadb5ed715f0e75b0c880664ae6e0995ccb70406735ab0dc3c8041a"} Feb 28 09:21:34 crc kubenswrapper[4687]: I0228 09:21:34.837555 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 28 09:21:34 crc 
kubenswrapper[4687]: I0228 09:21:34.840512 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 28 09:21:35 crc kubenswrapper[4687]: I0228 09:21:35.109992 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecee8514-9d15-4da0-ad62-416f6e9dc585","Type":"ContainerStarted","Data":"f918d839e3207fea4069f282c90c7ca1994a101367bb0af8987c33265ea88f9e"} Feb 28 09:21:35 crc kubenswrapper[4687]: I0228 09:21:35.110444 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecee8514-9d15-4da0-ad62-416f6e9dc585","Type":"ContainerStarted","Data":"7c505f37a57cf8c6c02665bcd56e80b94483f9dc5c5c9da569ada6c191709066"} Feb 28 09:21:37 crc kubenswrapper[4687]: I0228 09:21:37.156845 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecee8514-9d15-4da0-ad62-416f6e9dc585","Type":"ContainerStarted","Data":"55ab72de765c55c9b29eec931114df8c0a03845972f327333ff447579dd17e76"} Feb 28 09:21:37 crc kubenswrapper[4687]: I0228 09:21:37.157483 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ecee8514-9d15-4da0-ad62-416f6e9dc585" containerName="proxy-httpd" containerID="cri-o://55ab72de765c55c9b29eec931114df8c0a03845972f327333ff447579dd17e76" gracePeriod=30 Feb 28 09:21:37 crc kubenswrapper[4687]: I0228 09:21:37.157496 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 28 09:21:37 crc kubenswrapper[4687]: I0228 09:21:37.157044 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ecee8514-9d15-4da0-ad62-416f6e9dc585" containerName="ceilometer-central-agent" containerID="cri-o://f490c97eafadb5ed715f0e75b0c880664ae6e0995ccb70406735ab0dc3c8041a" gracePeriod=30 Feb 28 09:21:37 crc kubenswrapper[4687]: I0228 09:21:37.157558 4687 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ecee8514-9d15-4da0-ad62-416f6e9dc585" containerName="sg-core" containerID="cri-o://f918d839e3207fea4069f282c90c7ca1994a101367bb0af8987c33265ea88f9e" gracePeriod=30 Feb 28 09:21:37 crc kubenswrapper[4687]: I0228 09:21:37.157613 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ecee8514-9d15-4da0-ad62-416f6e9dc585" containerName="ceilometer-notification-agent" containerID="cri-o://7c505f37a57cf8c6c02665bcd56e80b94483f9dc5c5c9da569ada6c191709066" gracePeriod=30 Feb 28 09:21:37 crc kubenswrapper[4687]: I0228 09:21:37.190945 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.357763603 podStartE2EDuration="9.190929764s" podCreationTimestamp="2026-02-28 09:21:28 +0000 UTC" firstStartedPulling="2026-02-28 09:21:32.838818481 +0000 UTC m=+1084.529387807" lastFinishedPulling="2026-02-28 09:21:36.671984631 +0000 UTC m=+1088.362553968" observedRunningTime="2026-02-28 09:21:37.18065376 +0000 UTC m=+1088.871223097" watchObservedRunningTime="2026-02-28 09:21:37.190929764 +0000 UTC m=+1088.881499101" Feb 28 09:21:38 crc kubenswrapper[4687]: I0228 09:21:38.167771 4687 generic.go:334] "Generic (PLEG): container finished" podID="ecee8514-9d15-4da0-ad62-416f6e9dc585" containerID="55ab72de765c55c9b29eec931114df8c0a03845972f327333ff447579dd17e76" exitCode=0 Feb 28 09:21:38 crc kubenswrapper[4687]: I0228 09:21:38.167812 4687 generic.go:334] "Generic (PLEG): container finished" podID="ecee8514-9d15-4da0-ad62-416f6e9dc585" containerID="f918d839e3207fea4069f282c90c7ca1994a101367bb0af8987c33265ea88f9e" exitCode=2 Feb 28 09:21:38 crc kubenswrapper[4687]: I0228 09:21:38.167820 4687 generic.go:334] "Generic (PLEG): container finished" podID="ecee8514-9d15-4da0-ad62-416f6e9dc585" containerID="7c505f37a57cf8c6c02665bcd56e80b94483f9dc5c5c9da569ada6c191709066" 
exitCode=0 Feb 28 09:21:38 crc kubenswrapper[4687]: I0228 09:21:38.167845 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecee8514-9d15-4da0-ad62-416f6e9dc585","Type":"ContainerDied","Data":"55ab72de765c55c9b29eec931114df8c0a03845972f327333ff447579dd17e76"} Feb 28 09:21:38 crc kubenswrapper[4687]: I0228 09:21:38.167872 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecee8514-9d15-4da0-ad62-416f6e9dc585","Type":"ContainerDied","Data":"f918d839e3207fea4069f282c90c7ca1994a101367bb0af8987c33265ea88f9e"} Feb 28 09:21:38 crc kubenswrapper[4687]: I0228 09:21:38.167882 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecee8514-9d15-4da0-ad62-416f6e9dc585","Type":"ContainerDied","Data":"7c505f37a57cf8c6c02665bcd56e80b94483f9dc5c5c9da569ada6c191709066"} Feb 28 09:21:39 crc kubenswrapper[4687]: I0228 09:21:39.176954 4687 generic.go:334] "Generic (PLEG): container finished" podID="1c40a499-8f9a-4d0e-b266-4a5defbb7e22" containerID="0578526f9046619d023056e30230dbbd0cb87d8a6b2896535f311cb8c67a00c8" exitCode=0 Feb 28 09:21:39 crc kubenswrapper[4687]: I0228 09:21:39.177012 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-dm25t" event={"ID":"1c40a499-8f9a-4d0e-b266-4a5defbb7e22","Type":"ContainerDied","Data":"0578526f9046619d023056e30230dbbd0cb87d8a6b2896535f311cb8c67a00c8"} Feb 28 09:21:40 crc kubenswrapper[4687]: I0228 09:21:40.484280 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-dm25t" Feb 28 09:21:40 crc kubenswrapper[4687]: I0228 09:21:40.539063 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q84tl\" (UniqueName: \"kubernetes.io/projected/1c40a499-8f9a-4d0e-b266-4a5defbb7e22-kube-api-access-q84tl\") pod \"1c40a499-8f9a-4d0e-b266-4a5defbb7e22\" (UID: \"1c40a499-8f9a-4d0e-b266-4a5defbb7e22\") " Feb 28 09:21:40 crc kubenswrapper[4687]: I0228 09:21:40.539146 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c40a499-8f9a-4d0e-b266-4a5defbb7e22-combined-ca-bundle\") pod \"1c40a499-8f9a-4d0e-b266-4a5defbb7e22\" (UID: \"1c40a499-8f9a-4d0e-b266-4a5defbb7e22\") " Feb 28 09:21:40 crc kubenswrapper[4687]: I0228 09:21:40.539211 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c40a499-8f9a-4d0e-b266-4a5defbb7e22-scripts\") pod \"1c40a499-8f9a-4d0e-b266-4a5defbb7e22\" (UID: \"1c40a499-8f9a-4d0e-b266-4a5defbb7e22\") " Feb 28 09:21:40 crc kubenswrapper[4687]: I0228 09:21:40.539307 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c40a499-8f9a-4d0e-b266-4a5defbb7e22-config-data\") pod \"1c40a499-8f9a-4d0e-b266-4a5defbb7e22\" (UID: \"1c40a499-8f9a-4d0e-b266-4a5defbb7e22\") " Feb 28 09:21:40 crc kubenswrapper[4687]: I0228 09:21:40.547254 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c40a499-8f9a-4d0e-b266-4a5defbb7e22-scripts" (OuterVolumeSpecName: "scripts") pod "1c40a499-8f9a-4d0e-b266-4a5defbb7e22" (UID: "1c40a499-8f9a-4d0e-b266-4a5defbb7e22"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:40 crc kubenswrapper[4687]: I0228 09:21:40.547302 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c40a499-8f9a-4d0e-b266-4a5defbb7e22-kube-api-access-q84tl" (OuterVolumeSpecName: "kube-api-access-q84tl") pod "1c40a499-8f9a-4d0e-b266-4a5defbb7e22" (UID: "1c40a499-8f9a-4d0e-b266-4a5defbb7e22"). InnerVolumeSpecName "kube-api-access-q84tl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:21:40 crc kubenswrapper[4687]: I0228 09:21:40.566122 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c40a499-8f9a-4d0e-b266-4a5defbb7e22-config-data" (OuterVolumeSpecName: "config-data") pod "1c40a499-8f9a-4d0e-b266-4a5defbb7e22" (UID: "1c40a499-8f9a-4d0e-b266-4a5defbb7e22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:40 crc kubenswrapper[4687]: I0228 09:21:40.567802 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c40a499-8f9a-4d0e-b266-4a5defbb7e22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c40a499-8f9a-4d0e-b266-4a5defbb7e22" (UID: "1c40a499-8f9a-4d0e-b266-4a5defbb7e22"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:40 crc kubenswrapper[4687]: I0228 09:21:40.641857 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c40a499-8f9a-4d0e-b266-4a5defbb7e22-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:40 crc kubenswrapper[4687]: I0228 09:21:40.641889 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q84tl\" (UniqueName: \"kubernetes.io/projected/1c40a499-8f9a-4d0e-b266-4a5defbb7e22-kube-api-access-q84tl\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:40 crc kubenswrapper[4687]: I0228 09:21:40.641903 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c40a499-8f9a-4d0e-b266-4a5defbb7e22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:40 crc kubenswrapper[4687]: I0228 09:21:40.641913 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c40a499-8f9a-4d0e-b266-4a5defbb7e22-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:41 crc kubenswrapper[4687]: I0228 09:21:41.199940 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-dm25t" event={"ID":"1c40a499-8f9a-4d0e-b266-4a5defbb7e22","Type":"ContainerDied","Data":"18e7b8d506d533e25b79b2fb155f4d5104a8816ca6f614e1ae9f830ca0cd03d5"} Feb 28 09:21:41 crc kubenswrapper[4687]: I0228 09:21:41.199978 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-dm25t" Feb 28 09:21:41 crc kubenswrapper[4687]: I0228 09:21:41.200002 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18e7b8d506d533e25b79b2fb155f4d5104a8816ca6f614e1ae9f830ca0cd03d5" Feb 28 09:21:41 crc kubenswrapper[4687]: I0228 09:21:41.302906 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 28 09:21:41 crc kubenswrapper[4687]: E0228 09:21:41.303363 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c40a499-8f9a-4d0e-b266-4a5defbb7e22" containerName="nova-cell0-conductor-db-sync" Feb 28 09:21:41 crc kubenswrapper[4687]: I0228 09:21:41.303382 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c40a499-8f9a-4d0e-b266-4a5defbb7e22" containerName="nova-cell0-conductor-db-sync" Feb 28 09:21:41 crc kubenswrapper[4687]: I0228 09:21:41.303595 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c40a499-8f9a-4d0e-b266-4a5defbb7e22" containerName="nova-cell0-conductor-db-sync" Feb 28 09:21:41 crc kubenswrapper[4687]: I0228 09:21:41.306564 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 28 09:21:41 crc kubenswrapper[4687]: I0228 09:21:41.308510 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 28 09:21:41 crc kubenswrapper[4687]: I0228 09:21:41.309331 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-jb8lr" Feb 28 09:21:41 crc kubenswrapper[4687]: I0228 09:21:41.313055 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 28 09:21:41 crc kubenswrapper[4687]: I0228 09:21:41.354708 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7060db5b-32fc-481f-a4d6-520e585175b7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7060db5b-32fc-481f-a4d6-520e585175b7\") " pod="openstack/nova-cell0-conductor-0" Feb 28 09:21:41 crc kubenswrapper[4687]: I0228 09:21:41.354840 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4nq7\" (UniqueName: \"kubernetes.io/projected/7060db5b-32fc-481f-a4d6-520e585175b7-kube-api-access-m4nq7\") pod \"nova-cell0-conductor-0\" (UID: \"7060db5b-32fc-481f-a4d6-520e585175b7\") " pod="openstack/nova-cell0-conductor-0" Feb 28 09:21:41 crc kubenswrapper[4687]: I0228 09:21:41.354923 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7060db5b-32fc-481f-a4d6-520e585175b7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7060db5b-32fc-481f-a4d6-520e585175b7\") " pod="openstack/nova-cell0-conductor-0" Feb 28 09:21:41 crc kubenswrapper[4687]: I0228 09:21:41.456821 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7060db5b-32fc-481f-a4d6-520e585175b7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7060db5b-32fc-481f-a4d6-520e585175b7\") " pod="openstack/nova-cell0-conductor-0" Feb 28 09:21:41 crc kubenswrapper[4687]: I0228 09:21:41.456897 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4nq7\" (UniqueName: \"kubernetes.io/projected/7060db5b-32fc-481f-a4d6-520e585175b7-kube-api-access-m4nq7\") pod \"nova-cell0-conductor-0\" (UID: \"7060db5b-32fc-481f-a4d6-520e585175b7\") " pod="openstack/nova-cell0-conductor-0" Feb 28 09:21:41 crc kubenswrapper[4687]: I0228 09:21:41.456943 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7060db5b-32fc-481f-a4d6-520e585175b7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7060db5b-32fc-481f-a4d6-520e585175b7\") " pod="openstack/nova-cell0-conductor-0" Feb 28 09:21:41 crc kubenswrapper[4687]: I0228 09:21:41.462488 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7060db5b-32fc-481f-a4d6-520e585175b7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7060db5b-32fc-481f-a4d6-520e585175b7\") " pod="openstack/nova-cell0-conductor-0" Feb 28 09:21:41 crc kubenswrapper[4687]: I0228 09:21:41.463062 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7060db5b-32fc-481f-a4d6-520e585175b7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7060db5b-32fc-481f-a4d6-520e585175b7\") " pod="openstack/nova-cell0-conductor-0" Feb 28 09:21:41 crc kubenswrapper[4687]: I0228 09:21:41.472048 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4nq7\" (UniqueName: \"kubernetes.io/projected/7060db5b-32fc-481f-a4d6-520e585175b7-kube-api-access-m4nq7\") pod \"nova-cell0-conductor-0\" (UID: 
\"7060db5b-32fc-481f-a4d6-520e585175b7\") " pod="openstack/nova-cell0-conductor-0" Feb 28 09:21:41 crc kubenswrapper[4687]: I0228 09:21:41.626054 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 28 09:21:41 crc kubenswrapper[4687]: I0228 09:21:41.991644 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.016764 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.069606 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecee8514-9d15-4da0-ad62-416f6e9dc585-config-data\") pod \"ecee8514-9d15-4da0-ad62-416f6e9dc585\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.069755 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecee8514-9d15-4da0-ad62-416f6e9dc585-scripts\") pod \"ecee8514-9d15-4da0-ad62-416f6e9dc585\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.069811 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecee8514-9d15-4da0-ad62-416f6e9dc585-log-httpd\") pod \"ecee8514-9d15-4da0-ad62-416f6e9dc585\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.069830 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ecee8514-9d15-4da0-ad62-416f6e9dc585-sg-core-conf-yaml\") pod \"ecee8514-9d15-4da0-ad62-416f6e9dc585\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " Feb 28 09:21:42 crc 
kubenswrapper[4687]: I0228 09:21:42.069876 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5klvf\" (UniqueName: \"kubernetes.io/projected/ecee8514-9d15-4da0-ad62-416f6e9dc585-kube-api-access-5klvf\") pod \"ecee8514-9d15-4da0-ad62-416f6e9dc585\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.069905 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecee8514-9d15-4da0-ad62-416f6e9dc585-combined-ca-bundle\") pod \"ecee8514-9d15-4da0-ad62-416f6e9dc585\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.070009 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecee8514-9d15-4da0-ad62-416f6e9dc585-run-httpd\") pod \"ecee8514-9d15-4da0-ad62-416f6e9dc585\" (UID: \"ecee8514-9d15-4da0-ad62-416f6e9dc585\") " Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.070622 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecee8514-9d15-4da0-ad62-416f6e9dc585-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ecee8514-9d15-4da0-ad62-416f6e9dc585" (UID: "ecee8514-9d15-4da0-ad62-416f6e9dc585"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.070658 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecee8514-9d15-4da0-ad62-416f6e9dc585-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ecee8514-9d15-4da0-ad62-416f6e9dc585" (UID: "ecee8514-9d15-4da0-ad62-416f6e9dc585"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.072903 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecee8514-9d15-4da0-ad62-416f6e9dc585-scripts" (OuterVolumeSpecName: "scripts") pod "ecee8514-9d15-4da0-ad62-416f6e9dc585" (UID: "ecee8514-9d15-4da0-ad62-416f6e9dc585"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.073346 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecee8514-9d15-4da0-ad62-416f6e9dc585-kube-api-access-5klvf" (OuterVolumeSpecName: "kube-api-access-5klvf") pod "ecee8514-9d15-4da0-ad62-416f6e9dc585" (UID: "ecee8514-9d15-4da0-ad62-416f6e9dc585"). InnerVolumeSpecName "kube-api-access-5klvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.089925 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecee8514-9d15-4da0-ad62-416f6e9dc585-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ecee8514-9d15-4da0-ad62-416f6e9dc585" (UID: "ecee8514-9d15-4da0-ad62-416f6e9dc585"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.127486 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecee8514-9d15-4da0-ad62-416f6e9dc585-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ecee8514-9d15-4da0-ad62-416f6e9dc585" (UID: "ecee8514-9d15-4da0-ad62-416f6e9dc585"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.145324 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecee8514-9d15-4da0-ad62-416f6e9dc585-config-data" (OuterVolumeSpecName: "config-data") pod "ecee8514-9d15-4da0-ad62-416f6e9dc585" (UID: "ecee8514-9d15-4da0-ad62-416f6e9dc585"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.172943 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecee8514-9d15-4da0-ad62-416f6e9dc585-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.172977 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecee8514-9d15-4da0-ad62-416f6e9dc585-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.172990 4687 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ecee8514-9d15-4da0-ad62-416f6e9dc585-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.173002 4687 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecee8514-9d15-4da0-ad62-416f6e9dc585-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.173012 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5klvf\" (UniqueName: \"kubernetes.io/projected/ecee8514-9d15-4da0-ad62-416f6e9dc585-kube-api-access-5klvf\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.173036 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ecee8514-9d15-4da0-ad62-416f6e9dc585-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.173045 4687 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecee8514-9d15-4da0-ad62-416f6e9dc585-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.211441 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7060db5b-32fc-481f-a4d6-520e585175b7","Type":"ContainerStarted","Data":"a77b355f7de95379c7e69adbd69ff7e216e056fe8e99c2318eeb4a1403a9b58a"} Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.211513 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7060db5b-32fc-481f-a4d6-520e585175b7","Type":"ContainerStarted","Data":"167595b530bea769168a0073cc535fe0d9d8f60c5ecd07fe1b9cbf58f40a3ccb"} Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.212326 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.215793 4687 generic.go:334] "Generic (PLEG): container finished" podID="ecee8514-9d15-4da0-ad62-416f6e9dc585" containerID="f490c97eafadb5ed715f0e75b0c880664ae6e0995ccb70406735ab0dc3c8041a" exitCode=0 Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.215841 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecee8514-9d15-4da0-ad62-416f6e9dc585","Type":"ContainerDied","Data":"f490c97eafadb5ed715f0e75b0c880664ae6e0995ccb70406735ab0dc3c8041a"} Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.215871 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ecee8514-9d15-4da0-ad62-416f6e9dc585","Type":"ContainerDied","Data":"08fdfd2f4498aaefdb9bc7115d72837979f1aebf91b0d37c4d531b14ac8e703a"} Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.215904 4687 scope.go:117] "RemoveContainer" containerID="55ab72de765c55c9b29eec931114df8c0a03845972f327333ff447579dd17e76" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.215953 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.237382 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.237368938 podStartE2EDuration="1.237368938s" podCreationTimestamp="2026-02-28 09:21:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:21:42.227692363 +0000 UTC m=+1093.918261699" watchObservedRunningTime="2026-02-28 09:21:42.237368938 +0000 UTC m=+1093.927938275" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.238801 4687 scope.go:117] "RemoveContainer" containerID="f918d839e3207fea4069f282c90c7ca1994a101367bb0af8987c33265ea88f9e" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.252887 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.254162 4687 scope.go:117] "RemoveContainer" containerID="7c505f37a57cf8c6c02665bcd56e80b94483f9dc5c5c9da569ada6c191709066" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.265895 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.277494 4687 scope.go:117] "RemoveContainer" containerID="f490c97eafadb5ed715f0e75b0c880664ae6e0995ccb70406735ab0dc3c8041a" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.280094 4687 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/ceilometer-0"] Feb 28 09:21:42 crc kubenswrapper[4687]: E0228 09:21:42.280530 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecee8514-9d15-4da0-ad62-416f6e9dc585" containerName="ceilometer-notification-agent" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.280552 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecee8514-9d15-4da0-ad62-416f6e9dc585" containerName="ceilometer-notification-agent" Feb 28 09:21:42 crc kubenswrapper[4687]: E0228 09:21:42.280572 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecee8514-9d15-4da0-ad62-416f6e9dc585" containerName="proxy-httpd" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.280580 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecee8514-9d15-4da0-ad62-416f6e9dc585" containerName="proxy-httpd" Feb 28 09:21:42 crc kubenswrapper[4687]: E0228 09:21:42.280592 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecee8514-9d15-4da0-ad62-416f6e9dc585" containerName="ceilometer-central-agent" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.280598 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecee8514-9d15-4da0-ad62-416f6e9dc585" containerName="ceilometer-central-agent" Feb 28 09:21:42 crc kubenswrapper[4687]: E0228 09:21:42.280624 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecee8514-9d15-4da0-ad62-416f6e9dc585" containerName="sg-core" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.280630 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecee8514-9d15-4da0-ad62-416f6e9dc585" containerName="sg-core" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.280815 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecee8514-9d15-4da0-ad62-416f6e9dc585" containerName="sg-core" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.280835 4687 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ecee8514-9d15-4da0-ad62-416f6e9dc585" containerName="proxy-httpd" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.280845 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecee8514-9d15-4da0-ad62-416f6e9dc585" containerName="ceilometer-central-agent" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.280857 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecee8514-9d15-4da0-ad62-416f6e9dc585" containerName="ceilometer-notification-agent" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.287955 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.288089 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.291480 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.291680 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.298261 4687 scope.go:117] "RemoveContainer" containerID="55ab72de765c55c9b29eec931114df8c0a03845972f327333ff447579dd17e76" Feb 28 09:21:42 crc kubenswrapper[4687]: E0228 09:21:42.298692 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55ab72de765c55c9b29eec931114df8c0a03845972f327333ff447579dd17e76\": container with ID starting with 55ab72de765c55c9b29eec931114df8c0a03845972f327333ff447579dd17e76 not found: ID does not exist" containerID="55ab72de765c55c9b29eec931114df8c0a03845972f327333ff447579dd17e76" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.298728 4687 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"55ab72de765c55c9b29eec931114df8c0a03845972f327333ff447579dd17e76"} err="failed to get container status \"55ab72de765c55c9b29eec931114df8c0a03845972f327333ff447579dd17e76\": rpc error: code = NotFound desc = could not find container \"55ab72de765c55c9b29eec931114df8c0a03845972f327333ff447579dd17e76\": container with ID starting with 55ab72de765c55c9b29eec931114df8c0a03845972f327333ff447579dd17e76 not found: ID does not exist" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.298760 4687 scope.go:117] "RemoveContainer" containerID="f918d839e3207fea4069f282c90c7ca1994a101367bb0af8987c33265ea88f9e" Feb 28 09:21:42 crc kubenswrapper[4687]: E0228 09:21:42.299103 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f918d839e3207fea4069f282c90c7ca1994a101367bb0af8987c33265ea88f9e\": container with ID starting with f918d839e3207fea4069f282c90c7ca1994a101367bb0af8987c33265ea88f9e not found: ID does not exist" containerID="f918d839e3207fea4069f282c90c7ca1994a101367bb0af8987c33265ea88f9e" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.299132 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f918d839e3207fea4069f282c90c7ca1994a101367bb0af8987c33265ea88f9e"} err="failed to get container status \"f918d839e3207fea4069f282c90c7ca1994a101367bb0af8987c33265ea88f9e\": rpc error: code = NotFound desc = could not find container \"f918d839e3207fea4069f282c90c7ca1994a101367bb0af8987c33265ea88f9e\": container with ID starting with f918d839e3207fea4069f282c90c7ca1994a101367bb0af8987c33265ea88f9e not found: ID does not exist" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.299149 4687 scope.go:117] "RemoveContainer" containerID="7c505f37a57cf8c6c02665bcd56e80b94483f9dc5c5c9da569ada6c191709066" Feb 28 09:21:42 crc kubenswrapper[4687]: E0228 09:21:42.299536 4687 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7c505f37a57cf8c6c02665bcd56e80b94483f9dc5c5c9da569ada6c191709066\": container with ID starting with 7c505f37a57cf8c6c02665bcd56e80b94483f9dc5c5c9da569ada6c191709066 not found: ID does not exist" containerID="7c505f37a57cf8c6c02665bcd56e80b94483f9dc5c5c9da569ada6c191709066" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.299575 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c505f37a57cf8c6c02665bcd56e80b94483f9dc5c5c9da569ada6c191709066"} err="failed to get container status \"7c505f37a57cf8c6c02665bcd56e80b94483f9dc5c5c9da569ada6c191709066\": rpc error: code = NotFound desc = could not find container \"7c505f37a57cf8c6c02665bcd56e80b94483f9dc5c5c9da569ada6c191709066\": container with ID starting with 7c505f37a57cf8c6c02665bcd56e80b94483f9dc5c5c9da569ada6c191709066 not found: ID does not exist" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.299605 4687 scope.go:117] "RemoveContainer" containerID="f490c97eafadb5ed715f0e75b0c880664ae6e0995ccb70406735ab0dc3c8041a" Feb 28 09:21:42 crc kubenswrapper[4687]: E0228 09:21:42.299881 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f490c97eafadb5ed715f0e75b0c880664ae6e0995ccb70406735ab0dc3c8041a\": container with ID starting with f490c97eafadb5ed715f0e75b0c880664ae6e0995ccb70406735ab0dc3c8041a not found: ID does not exist" containerID="f490c97eafadb5ed715f0e75b0c880664ae6e0995ccb70406735ab0dc3c8041a" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.299913 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f490c97eafadb5ed715f0e75b0c880664ae6e0995ccb70406735ab0dc3c8041a"} err="failed to get container status \"f490c97eafadb5ed715f0e75b0c880664ae6e0995ccb70406735ab0dc3c8041a\": rpc error: code = NotFound desc = could not find container 
\"f490c97eafadb5ed715f0e75b0c880664ae6e0995ccb70406735ab0dc3c8041a\": container with ID starting with f490c97eafadb5ed715f0e75b0c880664ae6e0995ccb70406735ab0dc3c8041a not found: ID does not exist" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.376054 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77c646e3-3eb4-488f-b3ac-34feb004a255-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " pod="openstack/ceilometer-0" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.376111 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77c646e3-3eb4-488f-b3ac-34feb004a255-config-data\") pod \"ceilometer-0\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " pod="openstack/ceilometer-0" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.376131 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c646e3-3eb4-488f-b3ac-34feb004a255-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " pod="openstack/ceilometer-0" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.376228 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt9bj\" (UniqueName: \"kubernetes.io/projected/77c646e3-3eb4-488f-b3ac-34feb004a255-kube-api-access-pt9bj\") pod \"ceilometer-0\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " pod="openstack/ceilometer-0" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.376248 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77c646e3-3eb4-488f-b3ac-34feb004a255-run-httpd\") pod 
\"ceilometer-0\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " pod="openstack/ceilometer-0" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.376277 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77c646e3-3eb4-488f-b3ac-34feb004a255-scripts\") pod \"ceilometer-0\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " pod="openstack/ceilometer-0" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.376296 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77c646e3-3eb4-488f-b3ac-34feb004a255-log-httpd\") pod \"ceilometer-0\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " pod="openstack/ceilometer-0" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.480888 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77c646e3-3eb4-488f-b3ac-34feb004a255-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " pod="openstack/ceilometer-0" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.480983 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77c646e3-3eb4-488f-b3ac-34feb004a255-config-data\") pod \"ceilometer-0\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " pod="openstack/ceilometer-0" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.481035 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c646e3-3eb4-488f-b3ac-34feb004a255-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " pod="openstack/ceilometer-0" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.481141 4687 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pt9bj\" (UniqueName: \"kubernetes.io/projected/77c646e3-3eb4-488f-b3ac-34feb004a255-kube-api-access-pt9bj\") pod \"ceilometer-0\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " pod="openstack/ceilometer-0" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.481177 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77c646e3-3eb4-488f-b3ac-34feb004a255-scripts\") pod \"ceilometer-0\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " pod="openstack/ceilometer-0" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.481201 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77c646e3-3eb4-488f-b3ac-34feb004a255-run-httpd\") pod \"ceilometer-0\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " pod="openstack/ceilometer-0" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.481227 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77c646e3-3eb4-488f-b3ac-34feb004a255-log-httpd\") pod \"ceilometer-0\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " pod="openstack/ceilometer-0" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.481922 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77c646e3-3eb4-488f-b3ac-34feb004a255-log-httpd\") pod \"ceilometer-0\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " pod="openstack/ceilometer-0" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.493569 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77c646e3-3eb4-488f-b3ac-34feb004a255-config-data\") pod \"ceilometer-0\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " pod="openstack/ceilometer-0" Feb 28 
09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.493670 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77c646e3-3eb4-488f-b3ac-34feb004a255-run-httpd\") pod \"ceilometer-0\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " pod="openstack/ceilometer-0" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.500960 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c646e3-3eb4-488f-b3ac-34feb004a255-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " pod="openstack/ceilometer-0" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.503804 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77c646e3-3eb4-488f-b3ac-34feb004a255-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " pod="openstack/ceilometer-0" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.504110 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt9bj\" (UniqueName: \"kubernetes.io/projected/77c646e3-3eb4-488f-b3ac-34feb004a255-kube-api-access-pt9bj\") pod \"ceilometer-0\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " pod="openstack/ceilometer-0" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.506684 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77c646e3-3eb4-488f-b3ac-34feb004a255-scripts\") pod \"ceilometer-0\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " pod="openstack/ceilometer-0" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.602178 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:21:42 crc kubenswrapper[4687]: I0228 09:21:42.674276 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecee8514-9d15-4da0-ad62-416f6e9dc585" path="/var/lib/kubelet/pods/ecee8514-9d15-4da0-ad62-416f6e9dc585/volumes" Feb 28 09:21:43 crc kubenswrapper[4687]: I0228 09:21:43.009190 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:21:43 crc kubenswrapper[4687]: W0228 09:21:43.014636 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77c646e3_3eb4_488f_b3ac_34feb004a255.slice/crio-743be384731dcbe1912387bc5b787c933caca7cd4f1c8895eb6923869207e34e WatchSource:0}: Error finding container 743be384731dcbe1912387bc5b787c933caca7cd4f1c8895eb6923869207e34e: Status 404 returned error can't find the container with id 743be384731dcbe1912387bc5b787c933caca7cd4f1c8895eb6923869207e34e Feb 28 09:21:43 crc kubenswrapper[4687]: I0228 09:21:43.224699 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77c646e3-3eb4-488f-b3ac-34feb004a255","Type":"ContainerStarted","Data":"743be384731dcbe1912387bc5b787c933caca7cd4f1c8895eb6923869207e34e"} Feb 28 09:21:44 crc kubenswrapper[4687]: I0228 09:21:44.236378 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77c646e3-3eb4-488f-b3ac-34feb004a255","Type":"ContainerStarted","Data":"97cfe07edbfdd5c85516656057e74fb4dd55cc7612326f2c0fe3cea9c053f637"} Feb 28 09:21:45 crc kubenswrapper[4687]: I0228 09:21:45.250006 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77c646e3-3eb4-488f-b3ac-34feb004a255","Type":"ContainerStarted","Data":"1b94007e9d5a85876a992b2c77fbaa06991dd9a33506f46b528ba91ae8fd3f61"} Feb 28 09:21:47 crc kubenswrapper[4687]: I0228 09:21:47.270385 4687 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"77c646e3-3eb4-488f-b3ac-34feb004a255","Type":"ContainerStarted","Data":"9b68d1b548344e31f0e0b0728417dbf05a30e763edfb1cf56588609c5b8bb6d1"} Feb 28 09:21:49 crc kubenswrapper[4687]: I0228 09:21:49.301997 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77c646e3-3eb4-488f-b3ac-34feb004a255","Type":"ContainerStarted","Data":"2b007bb8f25bd7f308fcc2510186221cc1c834a2339de937b245f522cf7ea33e"} Feb 28 09:21:49 crc kubenswrapper[4687]: I0228 09:21:49.302463 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 28 09:21:49 crc kubenswrapper[4687]: I0228 09:21:49.337852 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.093616149 podStartE2EDuration="7.337823986s" podCreationTimestamp="2026-02-28 09:21:42 +0000 UTC" firstStartedPulling="2026-02-28 09:21:43.017949108 +0000 UTC m=+1094.708518446" lastFinishedPulling="2026-02-28 09:21:48.262156946 +0000 UTC m=+1099.952726283" observedRunningTime="2026-02-28 09:21:49.325450299 +0000 UTC m=+1101.016019636" watchObservedRunningTime="2026-02-28 09:21:49.337823986 +0000 UTC m=+1101.028393322" Feb 28 09:21:51 crc kubenswrapper[4687]: I0228 09:21:51.650521 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.046286 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-trkpz"] Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.047638 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-trkpz" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.049102 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.050679 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.052401 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-trkpz"] Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.101584 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sclf5\" (UniqueName: \"kubernetes.io/projected/3cfad4a9-c499-491b-bc53-5346948e6e2a-kube-api-access-sclf5\") pod \"nova-cell0-cell-mapping-trkpz\" (UID: \"3cfad4a9-c499-491b-bc53-5346948e6e2a\") " pod="openstack/nova-cell0-cell-mapping-trkpz" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.101633 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cfad4a9-c499-491b-bc53-5346948e6e2a-scripts\") pod \"nova-cell0-cell-mapping-trkpz\" (UID: \"3cfad4a9-c499-491b-bc53-5346948e6e2a\") " pod="openstack/nova-cell0-cell-mapping-trkpz" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.101683 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cfad4a9-c499-491b-bc53-5346948e6e2a-config-data\") pod \"nova-cell0-cell-mapping-trkpz\" (UID: \"3cfad4a9-c499-491b-bc53-5346948e6e2a\") " pod="openstack/nova-cell0-cell-mapping-trkpz" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.101710 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3cfad4a9-c499-491b-bc53-5346948e6e2a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-trkpz\" (UID: \"3cfad4a9-c499-491b-bc53-5346948e6e2a\") " pod="openstack/nova-cell0-cell-mapping-trkpz" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.190236 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.191359 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.193214 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.203124 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sclf5\" (UniqueName: \"kubernetes.io/projected/3cfad4a9-c499-491b-bc53-5346948e6e2a-kube-api-access-sclf5\") pod \"nova-cell0-cell-mapping-trkpz\" (UID: \"3cfad4a9-c499-491b-bc53-5346948e6e2a\") " pod="openstack/nova-cell0-cell-mapping-trkpz" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.203163 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cfad4a9-c499-491b-bc53-5346948e6e2a-scripts\") pod \"nova-cell0-cell-mapping-trkpz\" (UID: \"3cfad4a9-c499-491b-bc53-5346948e6e2a\") " pod="openstack/nova-cell0-cell-mapping-trkpz" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.203209 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cfad4a9-c499-491b-bc53-5346948e6e2a-config-data\") pod \"nova-cell0-cell-mapping-trkpz\" (UID: \"3cfad4a9-c499-491b-bc53-5346948e6e2a\") " pod="openstack/nova-cell0-cell-mapping-trkpz" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.203230 4687 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cfad4a9-c499-491b-bc53-5346948e6e2a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-trkpz\" (UID: \"3cfad4a9-c499-491b-bc53-5346948e6e2a\") " pod="openstack/nova-cell0-cell-mapping-trkpz" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.207404 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.211890 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cfad4a9-c499-491b-bc53-5346948e6e2a-scripts\") pod \"nova-cell0-cell-mapping-trkpz\" (UID: \"3cfad4a9-c499-491b-bc53-5346948e6e2a\") " pod="openstack/nova-cell0-cell-mapping-trkpz" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.212516 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cfad4a9-c499-491b-bc53-5346948e6e2a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-trkpz\" (UID: \"3cfad4a9-c499-491b-bc53-5346948e6e2a\") " pod="openstack/nova-cell0-cell-mapping-trkpz" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.213619 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cfad4a9-c499-491b-bc53-5346948e6e2a-config-data\") pod \"nova-cell0-cell-mapping-trkpz\" (UID: \"3cfad4a9-c499-491b-bc53-5346948e6e2a\") " pod="openstack/nova-cell0-cell-mapping-trkpz" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.231088 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.232400 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.234208 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.243491 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sclf5\" (UniqueName: \"kubernetes.io/projected/3cfad4a9-c499-491b-bc53-5346948e6e2a-kube-api-access-sclf5\") pod \"nova-cell0-cell-mapping-trkpz\" (UID: \"3cfad4a9-c499-491b-bc53-5346948e6e2a\") " pod="openstack/nova-cell0-cell-mapping-trkpz" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.284399 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.306545 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5129f869-98af-4a0f-ae3a-c3ac815078bc-config-data\") pod \"nova-scheduler-0\" (UID: \"5129f869-98af-4a0f-ae3a-c3ac815078bc\") " pod="openstack/nova-scheduler-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.306737 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hhpd\" (UniqueName: \"kubernetes.io/projected/8f4c2175-2584-4766-94f7-f775a2ea6fa1-kube-api-access-2hhpd\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f4c2175-2584-4766-94f7-f775a2ea6fa1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.306825 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f4c2175-2584-4766-94f7-f775a2ea6fa1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f4c2175-2584-4766-94f7-f775a2ea6fa1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:52 crc 
kubenswrapper[4687]: I0228 09:21:52.306910 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5129f869-98af-4a0f-ae3a-c3ac815078bc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5129f869-98af-4a0f-ae3a-c3ac815078bc\") " pod="openstack/nova-scheduler-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.307065 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f4c2175-2584-4766-94f7-f775a2ea6fa1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f4c2175-2584-4766-94f7-f775a2ea6fa1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.307174 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfwdw\" (UniqueName: \"kubernetes.io/projected/5129f869-98af-4a0f-ae3a-c3ac815078bc-kube-api-access-vfwdw\") pod \"nova-scheduler-0\" (UID: \"5129f869-98af-4a0f-ae3a-c3ac815078bc\") " pod="openstack/nova-scheduler-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.318109 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.319747 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.326552 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.326644 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.363289 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-trkpz" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.409542 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc6w9\" (UniqueName: \"kubernetes.io/projected/c9ce0b5b-0146-4809-9f5c-4e5547929f28-kube-api-access-xc6w9\") pod \"nova-api-0\" (UID: \"c9ce0b5b-0146-4809-9f5c-4e5547929f28\") " pod="openstack/nova-api-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.409588 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f4c2175-2584-4766-94f7-f775a2ea6fa1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f4c2175-2584-4766-94f7-f775a2ea6fa1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.409639 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9ce0b5b-0146-4809-9f5c-4e5547929f28-logs\") pod \"nova-api-0\" (UID: \"c9ce0b5b-0146-4809-9f5c-4e5547929f28\") " pod="openstack/nova-api-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.409670 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfwdw\" (UniqueName: \"kubernetes.io/projected/5129f869-98af-4a0f-ae3a-c3ac815078bc-kube-api-access-vfwdw\") pod \"nova-scheduler-0\" (UID: \"5129f869-98af-4a0f-ae3a-c3ac815078bc\") " pod="openstack/nova-scheduler-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.409707 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9ce0b5b-0146-4809-9f5c-4e5547929f28-config-data\") pod \"nova-api-0\" (UID: \"c9ce0b5b-0146-4809-9f5c-4e5547929f28\") " pod="openstack/nova-api-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.409793 
4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9ce0b5b-0146-4809-9f5c-4e5547929f28-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c9ce0b5b-0146-4809-9f5c-4e5547929f28\") " pod="openstack/nova-api-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.409816 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5129f869-98af-4a0f-ae3a-c3ac815078bc-config-data\") pod \"nova-scheduler-0\" (UID: \"5129f869-98af-4a0f-ae3a-c3ac815078bc\") " pod="openstack/nova-scheduler-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.409873 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hhpd\" (UniqueName: \"kubernetes.io/projected/8f4c2175-2584-4766-94f7-f775a2ea6fa1-kube-api-access-2hhpd\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f4c2175-2584-4766-94f7-f775a2ea6fa1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.409903 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f4c2175-2584-4766-94f7-f775a2ea6fa1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f4c2175-2584-4766-94f7-f775a2ea6fa1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.409931 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5129f869-98af-4a0f-ae3a-c3ac815078bc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5129f869-98af-4a0f-ae3a-c3ac815078bc\") " pod="openstack/nova-scheduler-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.413627 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5129f869-98af-4a0f-ae3a-c3ac815078bc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5129f869-98af-4a0f-ae3a-c3ac815078bc\") " pod="openstack/nova-scheduler-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.413676 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f4c2175-2584-4766-94f7-f775a2ea6fa1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f4c2175-2584-4766-94f7-f775a2ea6fa1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.414630 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5129f869-98af-4a0f-ae3a-c3ac815078bc-config-data\") pod \"nova-scheduler-0\" (UID: \"5129f869-98af-4a0f-ae3a-c3ac815078bc\") " pod="openstack/nova-scheduler-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.420541 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f4c2175-2584-4766-94f7-f775a2ea6fa1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f4c2175-2584-4766-94f7-f775a2ea6fa1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.430494 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hhpd\" (UniqueName: \"kubernetes.io/projected/8f4c2175-2584-4766-94f7-f775a2ea6fa1-kube-api-access-2hhpd\") pod \"nova-cell1-novncproxy-0\" (UID: \"8f4c2175-2584-4766-94f7-f775a2ea6fa1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.434297 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfwdw\" (UniqueName: \"kubernetes.io/projected/5129f869-98af-4a0f-ae3a-c3ac815078bc-kube-api-access-vfwdw\") pod \"nova-scheduler-0\" (UID: \"5129f869-98af-4a0f-ae3a-c3ac815078bc\") " 
pod="openstack/nova-scheduler-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.461996 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.463891 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.466454 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.472964 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.507816 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.514764 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9ce0b5b-0146-4809-9f5c-4e5547929f28-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c9ce0b5b-0146-4809-9f5c-4e5547929f28\") " pod="openstack/nova-api-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.514892 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc6w9\" (UniqueName: \"kubernetes.io/projected/c9ce0b5b-0146-4809-9f5c-4e5547929f28-kube-api-access-xc6w9\") pod \"nova-api-0\" (UID: \"c9ce0b5b-0146-4809-9f5c-4e5547929f28\") " pod="openstack/nova-api-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.514933 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9ce0b5b-0146-4809-9f5c-4e5547929f28-logs\") pod \"nova-api-0\" (UID: \"c9ce0b5b-0146-4809-9f5c-4e5547929f28\") " pod="openstack/nova-api-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.514968 4687 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9ce0b5b-0146-4809-9f5c-4e5547929f28-config-data\") pod \"nova-api-0\" (UID: \"c9ce0b5b-0146-4809-9f5c-4e5547929f28\") " pod="openstack/nova-api-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.520447 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9ce0b5b-0146-4809-9f5c-4e5547929f28-logs\") pod \"nova-api-0\" (UID: \"c9ce0b5b-0146-4809-9f5c-4e5547929f28\") " pod="openstack/nova-api-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.522144 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9ce0b5b-0146-4809-9f5c-4e5547929f28-config-data\") pod \"nova-api-0\" (UID: \"c9ce0b5b-0146-4809-9f5c-4e5547929f28\") " pod="openstack/nova-api-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.527613 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9ce0b5b-0146-4809-9f5c-4e5547929f28-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c9ce0b5b-0146-4809-9f5c-4e5547929f28\") " pod="openstack/nova-api-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.529060 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-84mdh"] Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.530598 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.554859 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc6w9\" (UniqueName: \"kubernetes.io/projected/c9ce0b5b-0146-4809-9f5c-4e5547929f28-kube-api-access-xc6w9\") pod \"nova-api-0\" (UID: \"c9ce0b5b-0146-4809-9f5c-4e5547929f28\") " pod="openstack/nova-api-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.589318 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-84mdh"] Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.609457 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.623832 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea4c574-1a0a-439a-a8bb-b13d1ff35676-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cea4c574-1a0a-439a-a8bb-b13d1ff35676\") " pod="openstack/nova-metadata-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.624465 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s4t5\" (UniqueName: \"kubernetes.io/projected/cea4c574-1a0a-439a-a8bb-b13d1ff35676-kube-api-access-8s4t5\") pod \"nova-metadata-0\" (UID: \"cea4c574-1a0a-439a-a8bb-b13d1ff35676\") " pod="openstack/nova-metadata-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.624687 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea4c574-1a0a-439a-a8bb-b13d1ff35676-config-data\") pod \"nova-metadata-0\" (UID: \"cea4c574-1a0a-439a-a8bb-b13d1ff35676\") " pod="openstack/nova-metadata-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.624768 4687 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea4c574-1a0a-439a-a8bb-b13d1ff35676-logs\") pod \"nova-metadata-0\" (UID: \"cea4c574-1a0a-439a-a8bb-b13d1ff35676\") " pod="openstack/nova-metadata-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.642401 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.726986 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd5679c8c-84mdh\" (UID: \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\") " pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.727238 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd5679c8c-84mdh\" (UID: \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\") " pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.727286 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea4c574-1a0a-439a-a8bb-b13d1ff35676-config-data\") pod \"nova-metadata-0\" (UID: \"cea4c574-1a0a-439a-a8bb-b13d1ff35676\") " pod="openstack/nova-metadata-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.727334 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf492\" (UniqueName: \"kubernetes.io/projected/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-kube-api-access-jf492\") pod \"dnsmasq-dns-7bd5679c8c-84mdh\" (UID: 
\"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\") " pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.727355 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea4c574-1a0a-439a-a8bb-b13d1ff35676-logs\") pod \"nova-metadata-0\" (UID: \"cea4c574-1a0a-439a-a8bb-b13d1ff35676\") " pod="openstack/nova-metadata-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.727396 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-dns-svc\") pod \"dnsmasq-dns-7bd5679c8c-84mdh\" (UID: \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\") " pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.727430 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea4c574-1a0a-439a-a8bb-b13d1ff35676-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cea4c574-1a0a-439a-a8bb-b13d1ff35676\") " pod="openstack/nova-metadata-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.727449 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s4t5\" (UniqueName: \"kubernetes.io/projected/cea4c574-1a0a-439a-a8bb-b13d1ff35676-kube-api-access-8s4t5\") pod \"nova-metadata-0\" (UID: \"cea4c574-1a0a-439a-a8bb-b13d1ff35676\") " pod="openstack/nova-metadata-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.727475 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-config\") pod \"dnsmasq-dns-7bd5679c8c-84mdh\" (UID: \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\") " pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" Feb 28 09:21:52 crc 
kubenswrapper[4687]: I0228 09:21:52.727501 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd5679c8c-84mdh\" (UID: \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\") " pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.728620 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea4c574-1a0a-439a-a8bb-b13d1ff35676-logs\") pod \"nova-metadata-0\" (UID: \"cea4c574-1a0a-439a-a8bb-b13d1ff35676\") " pod="openstack/nova-metadata-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.734745 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea4c574-1a0a-439a-a8bb-b13d1ff35676-config-data\") pod \"nova-metadata-0\" (UID: \"cea4c574-1a0a-439a-a8bb-b13d1ff35676\") " pod="openstack/nova-metadata-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.736549 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea4c574-1a0a-439a-a8bb-b13d1ff35676-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cea4c574-1a0a-439a-a8bb-b13d1ff35676\") " pod="openstack/nova-metadata-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.741687 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s4t5\" (UniqueName: \"kubernetes.io/projected/cea4c574-1a0a-439a-a8bb-b13d1ff35676-kube-api-access-8s4t5\") pod \"nova-metadata-0\" (UID: \"cea4c574-1a0a-439a-a8bb-b13d1ff35676\") " pod="openstack/nova-metadata-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.808899 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.829902 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf492\" (UniqueName: \"kubernetes.io/projected/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-kube-api-access-jf492\") pod \"dnsmasq-dns-7bd5679c8c-84mdh\" (UID: \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\") " pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.830006 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-dns-svc\") pod \"dnsmasq-dns-7bd5679c8c-84mdh\" (UID: \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\") " pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.830176 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-config\") pod \"dnsmasq-dns-7bd5679c8c-84mdh\" (UID: \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\") " pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.830214 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd5679c8c-84mdh\" (UID: \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\") " pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.830381 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd5679c8c-84mdh\" (UID: \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\") " pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" Feb 28 
09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.830404 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd5679c8c-84mdh\" (UID: \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\") " pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.831112 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-dns-svc\") pod \"dnsmasq-dns-7bd5679c8c-84mdh\" (UID: \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\") " pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.833937 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd5679c8c-84mdh\" (UID: \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\") " pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.834104 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd5679c8c-84mdh\" (UID: \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\") " pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.834137 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd5679c8c-84mdh\" (UID: \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\") " pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.834280 4687 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-config\") pod \"dnsmasq-dns-7bd5679c8c-84mdh\" (UID: \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\") " pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.856240 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf492\" (UniqueName: \"kubernetes.io/projected/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-kube-api-access-jf492\") pod \"dnsmasq-dns-7bd5679c8c-84mdh\" (UID: \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\") " pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.876253 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" Feb 28 09:21:52 crc kubenswrapper[4687]: I0228 09:21:52.928406 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-trkpz"] Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.047249 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 28 09:21:53 crc kubenswrapper[4687]: W0228 09:21:53.049537 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5129f869_98af_4a0f_ae3a_c3ac815078bc.slice/crio-36af6469ceb40ac6db878ea8585cd9fe2e75e95da8c3c4f0e9e8fc191e6da661 WatchSource:0}: Error finding container 36af6469ceb40ac6db878ea8585cd9fe2e75e95da8c3c4f0e9e8fc191e6da661: Status 404 returned error can't find the container with id 36af6469ceb40ac6db878ea8585cd9fe2e75e95da8c3c4f0e9e8fc191e6da661 Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.145108 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.152728 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.210075 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kz8x8"] Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.211381 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kz8x8" Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.212841 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.213223 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.221924 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kz8x8"] Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.245198 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f368345f-9e9f-448e-af56-24950cc3b1f9-config-data\") pod \"nova-cell1-conductor-db-sync-kz8x8\" (UID: \"f368345f-9e9f-448e-af56-24950cc3b1f9\") " pod="openstack/nova-cell1-conductor-db-sync-kz8x8" Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.245241 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn5pm\" (UniqueName: \"kubernetes.io/projected/f368345f-9e9f-448e-af56-24950cc3b1f9-kube-api-access-cn5pm\") pod \"nova-cell1-conductor-db-sync-kz8x8\" (UID: \"f368345f-9e9f-448e-af56-24950cc3b1f9\") " pod="openstack/nova-cell1-conductor-db-sync-kz8x8" Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.245267 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f368345f-9e9f-448e-af56-24950cc3b1f9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-kz8x8\" (UID: \"f368345f-9e9f-448e-af56-24950cc3b1f9\") " pod="openstack/nova-cell1-conductor-db-sync-kz8x8" Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.245373 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f368345f-9e9f-448e-af56-24950cc3b1f9-scripts\") pod \"nova-cell1-conductor-db-sync-kz8x8\" (UID: \"f368345f-9e9f-448e-af56-24950cc3b1f9\") " pod="openstack/nova-cell1-conductor-db-sync-kz8x8" Feb 28 09:21:53 crc kubenswrapper[4687]: W0228 09:21:53.304943 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcea4c574_1a0a_439a_a8bb_b13d1ff35676.slice/crio-57464dcdb397c12080a994ada2341e85f68ed84fbd40ae0b21d5380763df720c WatchSource:0}: Error finding container 57464dcdb397c12080a994ada2341e85f68ed84fbd40ae0b21d5380763df720c: Status 404 returned error can't find the container with id 57464dcdb397c12080a994ada2341e85f68ed84fbd40ae0b21d5380763df720c Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.308293 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.338607 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8f4c2175-2584-4766-94f7-f775a2ea6fa1","Type":"ContainerStarted","Data":"454143e01ecc207caabf439856b6c237ddb242d0cb1691b542105a3ca97c3e14"} Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.339530 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cea4c574-1a0a-439a-a8bb-b13d1ff35676","Type":"ContainerStarted","Data":"57464dcdb397c12080a994ada2341e85f68ed84fbd40ae0b21d5380763df720c"} Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.341096 
4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9ce0b5b-0146-4809-9f5c-4e5547929f28","Type":"ContainerStarted","Data":"40ab86a451d0a40dad199d7f5907caee0533cce208161145e08148ebe0ba2132"} Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.342491 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-trkpz" event={"ID":"3cfad4a9-c499-491b-bc53-5346948e6e2a","Type":"ContainerStarted","Data":"799ff8485be9db4864a4e0a9ee0295a8bb1f7cfff3160ec45b8e4b43cb6ec244"} Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.342520 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-trkpz" event={"ID":"3cfad4a9-c499-491b-bc53-5346948e6e2a","Type":"ContainerStarted","Data":"13f245a9ccbc4740a4d7346d199142e44ff4df15fff7518c2804211936aec985"} Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.344511 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5129f869-98af-4a0f-ae3a-c3ac815078bc","Type":"ContainerStarted","Data":"36af6469ceb40ac6db878ea8585cd9fe2e75e95da8c3c4f0e9e8fc191e6da661"} Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.347861 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f368345f-9e9f-448e-af56-24950cc3b1f9-config-data\") pod \"nova-cell1-conductor-db-sync-kz8x8\" (UID: \"f368345f-9e9f-448e-af56-24950cc3b1f9\") " pod="openstack/nova-cell1-conductor-db-sync-kz8x8" Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.347901 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn5pm\" (UniqueName: \"kubernetes.io/projected/f368345f-9e9f-448e-af56-24950cc3b1f9-kube-api-access-cn5pm\") pod \"nova-cell1-conductor-db-sync-kz8x8\" (UID: \"f368345f-9e9f-448e-af56-24950cc3b1f9\") " pod="openstack/nova-cell1-conductor-db-sync-kz8x8" Feb 28 09:21:53 crc 
kubenswrapper[4687]: I0228 09:21:53.347931 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f368345f-9e9f-448e-af56-24950cc3b1f9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-kz8x8\" (UID: \"f368345f-9e9f-448e-af56-24950cc3b1f9\") " pod="openstack/nova-cell1-conductor-db-sync-kz8x8" Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.348064 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f368345f-9e9f-448e-af56-24950cc3b1f9-scripts\") pod \"nova-cell1-conductor-db-sync-kz8x8\" (UID: \"f368345f-9e9f-448e-af56-24950cc3b1f9\") " pod="openstack/nova-cell1-conductor-db-sync-kz8x8" Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.355662 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f368345f-9e9f-448e-af56-24950cc3b1f9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-kz8x8\" (UID: \"f368345f-9e9f-448e-af56-24950cc3b1f9\") " pod="openstack/nova-cell1-conductor-db-sync-kz8x8" Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.355822 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f368345f-9e9f-448e-af56-24950cc3b1f9-config-data\") pod \"nova-cell1-conductor-db-sync-kz8x8\" (UID: \"f368345f-9e9f-448e-af56-24950cc3b1f9\") " pod="openstack/nova-cell1-conductor-db-sync-kz8x8" Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.355900 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f368345f-9e9f-448e-af56-24950cc3b1f9-scripts\") pod \"nova-cell1-conductor-db-sync-kz8x8\" (UID: \"f368345f-9e9f-448e-af56-24950cc3b1f9\") " pod="openstack/nova-cell1-conductor-db-sync-kz8x8" Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.363560 4687 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-trkpz" podStartSLOduration=1.363542311 podStartE2EDuration="1.363542311s" podCreationTimestamp="2026-02-28 09:21:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:21:53.360126078 +0000 UTC m=+1105.050695425" watchObservedRunningTime="2026-02-28 09:21:53.363542311 +0000 UTC m=+1105.054111649" Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.369433 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn5pm\" (UniqueName: \"kubernetes.io/projected/f368345f-9e9f-448e-af56-24950cc3b1f9-kube-api-access-cn5pm\") pod \"nova-cell1-conductor-db-sync-kz8x8\" (UID: \"f368345f-9e9f-448e-af56-24950cc3b1f9\") " pod="openstack/nova-cell1-conductor-db-sync-kz8x8" Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.400154 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-84mdh"] Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.571207 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kz8x8" Feb 28 09:21:53 crc kubenswrapper[4687]: I0228 09:21:53.974084 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kz8x8"] Feb 28 09:21:53 crc kubenswrapper[4687]: W0228 09:21:53.977431 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf368345f_9e9f_448e_af56_24950cc3b1f9.slice/crio-7fbf15fe5de4eb8403011be08d3daa9ad43c129a15ae242b0516952ae8d8c05b WatchSource:0}: Error finding container 7fbf15fe5de4eb8403011be08d3daa9ad43c129a15ae242b0516952ae8d8c05b: Status 404 returned error can't find the container with id 7fbf15fe5de4eb8403011be08d3daa9ad43c129a15ae242b0516952ae8d8c05b Feb 28 09:21:54 crc kubenswrapper[4687]: I0228 09:21:54.354769 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kz8x8" event={"ID":"f368345f-9e9f-448e-af56-24950cc3b1f9","Type":"ContainerStarted","Data":"fe67e581c0171a0d36c34a7aaf9063b4b6785236125470ebd698a17fb0341b72"} Feb 28 09:21:54 crc kubenswrapper[4687]: I0228 09:21:54.355000 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kz8x8" event={"ID":"f368345f-9e9f-448e-af56-24950cc3b1f9","Type":"ContainerStarted","Data":"7fbf15fe5de4eb8403011be08d3daa9ad43c129a15ae242b0516952ae8d8c05b"} Feb 28 09:21:54 crc kubenswrapper[4687]: I0228 09:21:54.362229 4687 generic.go:334] "Generic (PLEG): container finished" podID="56ba6f0a-8cc9-41a9-9444-5e338bd8a300" containerID="0246347d2a87ea04c4eadcf8749195860b6c2f9e1dc6da05148ce3725b2a30ed" exitCode=0 Feb 28 09:21:54 crc kubenswrapper[4687]: I0228 09:21:54.362308 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" event={"ID":"56ba6f0a-8cc9-41a9-9444-5e338bd8a300","Type":"ContainerDied","Data":"0246347d2a87ea04c4eadcf8749195860b6c2f9e1dc6da05148ce3725b2a30ed"} 
Feb 28 09:21:54 crc kubenswrapper[4687]: I0228 09:21:54.362337 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" event={"ID":"56ba6f0a-8cc9-41a9-9444-5e338bd8a300","Type":"ContainerStarted","Data":"1e5a8baa59919f6ec0c426d39c8edd513acb42eb01510b2ededf8fb479aaebe9"} Feb 28 09:21:54 crc kubenswrapper[4687]: I0228 09:21:54.385651 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-kz8x8" podStartSLOduration=1.385627617 podStartE2EDuration="1.385627617s" podCreationTimestamp="2026-02-28 09:21:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:21:54.37205824 +0000 UTC m=+1106.062627587" watchObservedRunningTime="2026-02-28 09:21:54.385627617 +0000 UTC m=+1106.076196953" Feb 28 09:21:55 crc kubenswrapper[4687]: I0228 09:21:55.005264 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:21:55 crc kubenswrapper[4687]: I0228 09:21:55.005328 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:21:56 crc kubenswrapper[4687]: I0228 09:21:56.092107 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 28 09:21:56 crc kubenswrapper[4687]: I0228 09:21:56.115009 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 28 09:21:56 crc kubenswrapper[4687]: I0228 
09:21:56.396155 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5129f869-98af-4a0f-ae3a-c3ac815078bc","Type":"ContainerStarted","Data":"cbe6bcb4183df4aff06fa3e028510922ed8a7f4b6b2caf136db98f99b91ccd73"} Feb 28 09:21:56 crc kubenswrapper[4687]: I0228 09:21:56.399572 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8f4c2175-2584-4766-94f7-f775a2ea6fa1","Type":"ContainerStarted","Data":"2199a1efe592055e0d74a1b2e0d74874a85201a8ea8678252d3786a50550d3df"} Feb 28 09:21:56 crc kubenswrapper[4687]: I0228 09:21:56.404406 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cea4c574-1a0a-439a-a8bb-b13d1ff35676","Type":"ContainerStarted","Data":"8c5e8f89e1ca4dd03c6d483b4b92bb5f224375a766d7be33859940495ab86675"} Feb 28 09:21:56 crc kubenswrapper[4687]: I0228 09:21:56.405934 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9ce0b5b-0146-4809-9f5c-4e5547929f28","Type":"ContainerStarted","Data":"7a779fabce6b4450ddb7468f42bf81556d5d8f5d7ac01cf3c97a12a55a0c6c76"} Feb 28 09:21:56 crc kubenswrapper[4687]: I0228 09:21:56.412436 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.216231094 podStartE2EDuration="4.412424107s" podCreationTimestamp="2026-02-28 09:21:52 +0000 UTC" firstStartedPulling="2026-02-28 09:21:53.052879953 +0000 UTC m=+1104.743449290" lastFinishedPulling="2026-02-28 09:21:55.249072966 +0000 UTC m=+1106.939642303" observedRunningTime="2026-02-28 09:21:56.409988878 +0000 UTC m=+1108.100558225" watchObservedRunningTime="2026-02-28 09:21:56.412424107 +0000 UTC m=+1108.102993444" Feb 28 09:21:56 crc kubenswrapper[4687]: I0228 09:21:56.415687 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" 
event={"ID":"56ba6f0a-8cc9-41a9-9444-5e338bd8a300","Type":"ContainerStarted","Data":"9b75fc2922aafa021e089557a41ded5331bfaa304c114d21b24ef171a7c92f45"} Feb 28 09:21:56 crc kubenswrapper[4687]: I0228 09:21:56.416562 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" Feb 28 09:21:56 crc kubenswrapper[4687]: I0228 09:21:56.431342 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.332905578 podStartE2EDuration="4.431322934s" podCreationTimestamp="2026-02-28 09:21:52 +0000 UTC" firstStartedPulling="2026-02-28 09:21:53.155795595 +0000 UTC m=+1104.846364932" lastFinishedPulling="2026-02-28 09:21:55.254212951 +0000 UTC m=+1106.944782288" observedRunningTime="2026-02-28 09:21:56.429381634 +0000 UTC m=+1108.119950971" watchObservedRunningTime="2026-02-28 09:21:56.431322934 +0000 UTC m=+1108.121892271" Feb 28 09:21:56 crc kubenswrapper[4687]: I0228 09:21:56.451217 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" podStartSLOduration=4.451204909 podStartE2EDuration="4.451204909s" podCreationTimestamp="2026-02-28 09:21:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:21:56.44725929 +0000 UTC m=+1108.137828627" watchObservedRunningTime="2026-02-28 09:21:56.451204909 +0000 UTC m=+1108.141774246" Feb 28 09:21:57 crc kubenswrapper[4687]: I0228 09:21:57.429868 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cea4c574-1a0a-439a-a8bb-b13d1ff35676","Type":"ContainerStarted","Data":"dd92509f10111332b0721fb941a87088b634ace87274e307d052c8a32a4afe87"} Feb 28 09:21:57 crc kubenswrapper[4687]: I0228 09:21:57.430721 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="cea4c574-1a0a-439a-a8bb-b13d1ff35676" containerName="nova-metadata-log" containerID="cri-o://8c5e8f89e1ca4dd03c6d483b4b92bb5f224375a766d7be33859940495ab86675" gracePeriod=30 Feb 28 09:21:57 crc kubenswrapper[4687]: I0228 09:21:57.431405 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cea4c574-1a0a-439a-a8bb-b13d1ff35676" containerName="nova-metadata-metadata" containerID="cri-o://dd92509f10111332b0721fb941a87088b634ace87274e307d052c8a32a4afe87" gracePeriod=30 Feb 28 09:21:57 crc kubenswrapper[4687]: I0228 09:21:57.433426 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9ce0b5b-0146-4809-9f5c-4e5547929f28","Type":"ContainerStarted","Data":"85e3f5794b8528f2326c40c80389157ae2d8c4b14f24b09d03bd66f246d8e3dc"} Feb 28 09:21:57 crc kubenswrapper[4687]: I0228 09:21:57.433555 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="8f4c2175-2584-4766-94f7-f775a2ea6fa1" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://2199a1efe592055e0d74a1b2e0d74874a85201a8ea8678252d3786a50550d3df" gracePeriod=30 Feb 28 09:21:57 crc kubenswrapper[4687]: I0228 09:21:57.459226 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.651639874 podStartE2EDuration="5.459211471s" podCreationTimestamp="2026-02-28 09:21:52 +0000 UTC" firstStartedPulling="2026-02-28 09:21:53.308721476 +0000 UTC m=+1104.999290813" lastFinishedPulling="2026-02-28 09:21:56.116293073 +0000 UTC m=+1107.806862410" observedRunningTime="2026-02-28 09:21:57.452710869 +0000 UTC m=+1109.143280205" watchObservedRunningTime="2026-02-28 09:21:57.459211471 +0000 UTC m=+1109.149780809" Feb 28 09:21:57 crc kubenswrapper[4687]: I0228 09:21:57.479004 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" 
podStartSLOduration=2.528106643 podStartE2EDuration="5.478973602s" podCreationTimestamp="2026-02-28 09:21:52 +0000 UTC" firstStartedPulling="2026-02-28 09:21:53.167265494 +0000 UTC m=+1104.857834830" lastFinishedPulling="2026-02-28 09:21:56.118132452 +0000 UTC m=+1107.808701789" observedRunningTime="2026-02-28 09:21:57.474626448 +0000 UTC m=+1109.165195785" watchObservedRunningTime="2026-02-28 09:21:57.478973602 +0000 UTC m=+1109.169542939" Feb 28 09:21:57 crc kubenswrapper[4687]: I0228 09:21:57.508462 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 28 09:21:57 crc kubenswrapper[4687]: I0228 09:21:57.611296 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:57 crc kubenswrapper[4687]: I0228 09:21:57.809348 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 28 09:21:57 crc kubenswrapper[4687]: I0228 09:21:57.809391 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.041919 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.045759 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.166232 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hhpd\" (UniqueName: \"kubernetes.io/projected/8f4c2175-2584-4766-94f7-f775a2ea6fa1-kube-api-access-2hhpd\") pod \"8f4c2175-2584-4766-94f7-f775a2ea6fa1\" (UID: \"8f4c2175-2584-4766-94f7-f775a2ea6fa1\") " Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.166406 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f4c2175-2584-4766-94f7-f775a2ea6fa1-config-data\") pod \"8f4c2175-2584-4766-94f7-f775a2ea6fa1\" (UID: \"8f4c2175-2584-4766-94f7-f775a2ea6fa1\") " Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.166439 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea4c574-1a0a-439a-a8bb-b13d1ff35676-combined-ca-bundle\") pod \"cea4c574-1a0a-439a-a8bb-b13d1ff35676\" (UID: \"cea4c574-1a0a-439a-a8bb-b13d1ff35676\") " Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.166467 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea4c574-1a0a-439a-a8bb-b13d1ff35676-config-data\") pod \"cea4c574-1a0a-439a-a8bb-b13d1ff35676\" (UID: \"cea4c574-1a0a-439a-a8bb-b13d1ff35676\") " Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.166608 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f4c2175-2584-4766-94f7-f775a2ea6fa1-combined-ca-bundle\") pod \"8f4c2175-2584-4766-94f7-f775a2ea6fa1\" (UID: \"8f4c2175-2584-4766-94f7-f775a2ea6fa1\") " Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.166665 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/cea4c574-1a0a-439a-a8bb-b13d1ff35676-logs\") pod \"cea4c574-1a0a-439a-a8bb-b13d1ff35676\" (UID: \"cea4c574-1a0a-439a-a8bb-b13d1ff35676\") " Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.166715 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s4t5\" (UniqueName: \"kubernetes.io/projected/cea4c574-1a0a-439a-a8bb-b13d1ff35676-kube-api-access-8s4t5\") pod \"cea4c574-1a0a-439a-a8bb-b13d1ff35676\" (UID: \"cea4c574-1a0a-439a-a8bb-b13d1ff35676\") " Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.168049 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cea4c574-1a0a-439a-a8bb-b13d1ff35676-logs" (OuterVolumeSpecName: "logs") pod "cea4c574-1a0a-439a-a8bb-b13d1ff35676" (UID: "cea4c574-1a0a-439a-a8bb-b13d1ff35676"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.173541 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f4c2175-2584-4766-94f7-f775a2ea6fa1-kube-api-access-2hhpd" (OuterVolumeSpecName: "kube-api-access-2hhpd") pod "8f4c2175-2584-4766-94f7-f775a2ea6fa1" (UID: "8f4c2175-2584-4766-94f7-f775a2ea6fa1"). InnerVolumeSpecName "kube-api-access-2hhpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.173679 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cea4c574-1a0a-439a-a8bb-b13d1ff35676-kube-api-access-8s4t5" (OuterVolumeSpecName: "kube-api-access-8s4t5") pod "cea4c574-1a0a-439a-a8bb-b13d1ff35676" (UID: "cea4c574-1a0a-439a-a8bb-b13d1ff35676"). InnerVolumeSpecName "kube-api-access-8s4t5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.192350 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea4c574-1a0a-439a-a8bb-b13d1ff35676-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cea4c574-1a0a-439a-a8bb-b13d1ff35676" (UID: "cea4c574-1a0a-439a-a8bb-b13d1ff35676"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.193405 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea4c574-1a0a-439a-a8bb-b13d1ff35676-config-data" (OuterVolumeSpecName: "config-data") pod "cea4c574-1a0a-439a-a8bb-b13d1ff35676" (UID: "cea4c574-1a0a-439a-a8bb-b13d1ff35676"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.193990 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f4c2175-2584-4766-94f7-f775a2ea6fa1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f4c2175-2584-4766-94f7-f775a2ea6fa1" (UID: "8f4c2175-2584-4766-94f7-f775a2ea6fa1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.196095 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f4c2175-2584-4766-94f7-f775a2ea6fa1-config-data" (OuterVolumeSpecName: "config-data") pod "8f4c2175-2584-4766-94f7-f775a2ea6fa1" (UID: "8f4c2175-2584-4766-94f7-f775a2ea6fa1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.269927 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hhpd\" (UniqueName: \"kubernetes.io/projected/8f4c2175-2584-4766-94f7-f775a2ea6fa1-kube-api-access-2hhpd\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.269965 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f4c2175-2584-4766-94f7-f775a2ea6fa1-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.270075 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea4c574-1a0a-439a-a8bb-b13d1ff35676-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.270088 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea4c574-1a0a-439a-a8bb-b13d1ff35676-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.270099 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f4c2175-2584-4766-94f7-f775a2ea6fa1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.270108 4687 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea4c574-1a0a-439a-a8bb-b13d1ff35676-logs\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.270118 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s4t5\" (UniqueName: \"kubernetes.io/projected/cea4c574-1a0a-439a-a8bb-b13d1ff35676-kube-api-access-8s4t5\") on node \"crc\" DevicePath \"\"" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.445354 4687 
generic.go:334] "Generic (PLEG): container finished" podID="cea4c574-1a0a-439a-a8bb-b13d1ff35676" containerID="dd92509f10111332b0721fb941a87088b634ace87274e307d052c8a32a4afe87" exitCode=0 Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.445398 4687 generic.go:334] "Generic (PLEG): container finished" podID="cea4c574-1a0a-439a-a8bb-b13d1ff35676" containerID="8c5e8f89e1ca4dd03c6d483b4b92bb5f224375a766d7be33859940495ab86675" exitCode=143 Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.445407 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cea4c574-1a0a-439a-a8bb-b13d1ff35676","Type":"ContainerDied","Data":"dd92509f10111332b0721fb941a87088b634ace87274e307d052c8a32a4afe87"} Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.445437 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.445475 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cea4c574-1a0a-439a-a8bb-b13d1ff35676","Type":"ContainerDied","Data":"8c5e8f89e1ca4dd03c6d483b4b92bb5f224375a766d7be33859940495ab86675"} Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.445491 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cea4c574-1a0a-439a-a8bb-b13d1ff35676","Type":"ContainerDied","Data":"57464dcdb397c12080a994ada2341e85f68ed84fbd40ae0b21d5380763df720c"} Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.445514 4687 scope.go:117] "RemoveContainer" containerID="dd92509f10111332b0721fb941a87088b634ace87274e307d052c8a32a4afe87" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.447215 4687 generic.go:334] "Generic (PLEG): container finished" podID="8f4c2175-2584-4766-94f7-f775a2ea6fa1" containerID="2199a1efe592055e0d74a1b2e0d74874a85201a8ea8678252d3786a50550d3df" exitCode=0 Feb 28 09:21:58 crc kubenswrapper[4687]: 
I0228 09:21:58.447280 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8f4c2175-2584-4766-94f7-f775a2ea6fa1","Type":"ContainerDied","Data":"2199a1efe592055e0d74a1b2e0d74874a85201a8ea8678252d3786a50550d3df"} Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.447301 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8f4c2175-2584-4766-94f7-f775a2ea6fa1","Type":"ContainerDied","Data":"454143e01ecc207caabf439856b6c237ddb242d0cb1691b542105a3ca97c3e14"} Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.447493 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.449457 4687 generic.go:334] "Generic (PLEG): container finished" podID="f368345f-9e9f-448e-af56-24950cc3b1f9" containerID="fe67e581c0171a0d36c34a7aaf9063b4b6785236125470ebd698a17fb0341b72" exitCode=0 Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.449523 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kz8x8" event={"ID":"f368345f-9e9f-448e-af56-24950cc3b1f9","Type":"ContainerDied","Data":"fe67e581c0171a0d36c34a7aaf9063b4b6785236125470ebd698a17fb0341b72"} Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.472549 4687 scope.go:117] "RemoveContainer" containerID="8c5e8f89e1ca4dd03c6d483b4b92bb5f224375a766d7be33859940495ab86675" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.500828 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.504126 4687 scope.go:117] "RemoveContainer" containerID="dd92509f10111332b0721fb941a87088b634ace87274e307d052c8a32a4afe87" Feb 28 09:21:58 crc kubenswrapper[4687]: E0228 09:21:58.506649 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"dd92509f10111332b0721fb941a87088b634ace87274e307d052c8a32a4afe87\": container with ID starting with dd92509f10111332b0721fb941a87088b634ace87274e307d052c8a32a4afe87 not found: ID does not exist" containerID="dd92509f10111332b0721fb941a87088b634ace87274e307d052c8a32a4afe87" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.506707 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd92509f10111332b0721fb941a87088b634ace87274e307d052c8a32a4afe87"} err="failed to get container status \"dd92509f10111332b0721fb941a87088b634ace87274e307d052c8a32a4afe87\": rpc error: code = NotFound desc = could not find container \"dd92509f10111332b0721fb941a87088b634ace87274e307d052c8a32a4afe87\": container with ID starting with dd92509f10111332b0721fb941a87088b634ace87274e307d052c8a32a4afe87 not found: ID does not exist" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.506744 4687 scope.go:117] "RemoveContainer" containerID="8c5e8f89e1ca4dd03c6d483b4b92bb5f224375a766d7be33859940495ab86675" Feb 28 09:21:58 crc kubenswrapper[4687]: E0228 09:21:58.509617 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c5e8f89e1ca4dd03c6d483b4b92bb5f224375a766d7be33859940495ab86675\": container with ID starting with 8c5e8f89e1ca4dd03c6d483b4b92bb5f224375a766d7be33859940495ab86675 not found: ID does not exist" containerID="8c5e8f89e1ca4dd03c6d483b4b92bb5f224375a766d7be33859940495ab86675" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.509755 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c5e8f89e1ca4dd03c6d483b4b92bb5f224375a766d7be33859940495ab86675"} err="failed to get container status \"8c5e8f89e1ca4dd03c6d483b4b92bb5f224375a766d7be33859940495ab86675\": rpc error: code = NotFound desc = could not find container 
\"8c5e8f89e1ca4dd03c6d483b4b92bb5f224375a766d7be33859940495ab86675\": container with ID starting with 8c5e8f89e1ca4dd03c6d483b4b92bb5f224375a766d7be33859940495ab86675 not found: ID does not exist" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.509847 4687 scope.go:117] "RemoveContainer" containerID="dd92509f10111332b0721fb941a87088b634ace87274e307d052c8a32a4afe87" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.512527 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd92509f10111332b0721fb941a87088b634ace87274e307d052c8a32a4afe87"} err="failed to get container status \"dd92509f10111332b0721fb941a87088b634ace87274e307d052c8a32a4afe87\": rpc error: code = NotFound desc = could not find container \"dd92509f10111332b0721fb941a87088b634ace87274e307d052c8a32a4afe87\": container with ID starting with dd92509f10111332b0721fb941a87088b634ace87274e307d052c8a32a4afe87 not found: ID does not exist" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.512597 4687 scope.go:117] "RemoveContainer" containerID="8c5e8f89e1ca4dd03c6d483b4b92bb5f224375a766d7be33859940495ab86675" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.514162 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c5e8f89e1ca4dd03c6d483b4b92bb5f224375a766d7be33859940495ab86675"} err="failed to get container status \"8c5e8f89e1ca4dd03c6d483b4b92bb5f224375a766d7be33859940495ab86675\": rpc error: code = NotFound desc = could not find container \"8c5e8f89e1ca4dd03c6d483b4b92bb5f224375a766d7be33859940495ab86675\": container with ID starting with 8c5e8f89e1ca4dd03c6d483b4b92bb5f224375a766d7be33859940495ab86675 not found: ID does not exist" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.514206 4687 scope.go:117] "RemoveContainer" containerID="2199a1efe592055e0d74a1b2e0d74874a85201a8ea8678252d3786a50550d3df" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.515004 4687 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.522571 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.538472 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.566780 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 28 09:21:58 crc kubenswrapper[4687]: E0228 09:21:58.567478 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea4c574-1a0a-439a-a8bb-b13d1ff35676" containerName="nova-metadata-log" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.567502 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea4c574-1a0a-439a-a8bb-b13d1ff35676" containerName="nova-metadata-log" Feb 28 09:21:58 crc kubenswrapper[4687]: E0228 09:21:58.567534 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea4c574-1a0a-439a-a8bb-b13d1ff35676" containerName="nova-metadata-metadata" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.567541 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea4c574-1a0a-439a-a8bb-b13d1ff35676" containerName="nova-metadata-metadata" Feb 28 09:21:58 crc kubenswrapper[4687]: E0228 09:21:58.567555 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f4c2175-2584-4766-94f7-f775a2ea6fa1" containerName="nova-cell1-novncproxy-novncproxy" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.567561 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f4c2175-2584-4766-94f7-f775a2ea6fa1" containerName="nova-cell1-novncproxy-novncproxy" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.567766 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f4c2175-2584-4766-94f7-f775a2ea6fa1" 
containerName="nova-cell1-novncproxy-novncproxy" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.567787 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="cea4c574-1a0a-439a-a8bb-b13d1ff35676" containerName="nova-metadata-metadata" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.567802 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="cea4c574-1a0a-439a-a8bb-b13d1ff35676" containerName="nova-metadata-log" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.568476 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.577244 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.577495 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.577651 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.579324 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02b56b91-2ca9-4bea-b8d4-ad653daa91b8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"02b56b91-2ca9-4bea-b8d4-ad653daa91b8\") " pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.579393 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/02b56b91-2ca9-4bea-b8d4-ad653daa91b8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"02b56b91-2ca9-4bea-b8d4-ad653daa91b8\") " pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:58 crc 
kubenswrapper[4687]: I0228 09:21:58.579423 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/02b56b91-2ca9-4bea-b8d4-ad653daa91b8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"02b56b91-2ca9-4bea-b8d4-ad653daa91b8\") " pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.579449 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz79j\" (UniqueName: \"kubernetes.io/projected/02b56b91-2ca9-4bea-b8d4-ad653daa91b8-kube-api-access-cz79j\") pod \"nova-cell1-novncproxy-0\" (UID: \"02b56b91-2ca9-4bea-b8d4-ad653daa91b8\") " pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.579467 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02b56b91-2ca9-4bea-b8d4-ad653daa91b8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"02b56b91-2ca9-4bea-b8d4-ad653daa91b8\") " pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.580426 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.581662 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.584210 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.587305 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.588431 4687 scope.go:117] "RemoveContainer" containerID="2199a1efe592055e0d74a1b2e0d74874a85201a8ea8678252d3786a50550d3df" Feb 28 09:21:58 crc kubenswrapper[4687]: E0228 09:21:58.590165 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2199a1efe592055e0d74a1b2e0d74874a85201a8ea8678252d3786a50550d3df\": container with ID starting with 2199a1efe592055e0d74a1b2e0d74874a85201a8ea8678252d3786a50550d3df not found: ID does not exist" containerID="2199a1efe592055e0d74a1b2e0d74874a85201a8ea8678252d3786a50550d3df" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.590209 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2199a1efe592055e0d74a1b2e0d74874a85201a8ea8678252d3786a50550d3df"} err="failed to get container status \"2199a1efe592055e0d74a1b2e0d74874a85201a8ea8678252d3786a50550d3df\": rpc error: code = NotFound desc = could not find container \"2199a1efe592055e0d74a1b2e0d74874a85201a8ea8678252d3786a50550d3df\": container with ID starting with 2199a1efe592055e0d74a1b2e0d74874a85201a8ea8678252d3786a50550d3df not found: ID does not exist" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.597126 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.607591 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 
09:21:58.689535 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02b56b91-2ca9-4bea-b8d4-ad653daa91b8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"02b56b91-2ca9-4bea-b8d4-ad653daa91b8\") " pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.689641 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/02b56b91-2ca9-4bea-b8d4-ad653daa91b8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"02b56b91-2ca9-4bea-b8d4-ad653daa91b8\") " pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.689683 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/02b56b91-2ca9-4bea-b8d4-ad653daa91b8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"02b56b91-2ca9-4bea-b8d4-ad653daa91b8\") " pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.689717 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz79j\" (UniqueName: \"kubernetes.io/projected/02b56b91-2ca9-4bea-b8d4-ad653daa91b8-kube-api-access-cz79j\") pod \"nova-cell1-novncproxy-0\" (UID: \"02b56b91-2ca9-4bea-b8d4-ad653daa91b8\") " pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.689734 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02b56b91-2ca9-4bea-b8d4-ad653daa91b8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"02b56b91-2ca9-4bea-b8d4-ad653daa91b8\") " pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.693872 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="8f4c2175-2584-4766-94f7-f775a2ea6fa1" path="/var/lib/kubelet/pods/8f4c2175-2584-4766-94f7-f775a2ea6fa1/volumes" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.694719 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cea4c574-1a0a-439a-a8bb-b13d1ff35676" path="/var/lib/kubelet/pods/cea4c574-1a0a-439a-a8bb-b13d1ff35676/volumes" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.714223 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02b56b91-2ca9-4bea-b8d4-ad653daa91b8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"02b56b91-2ca9-4bea-b8d4-ad653daa91b8\") " pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.716186 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/02b56b91-2ca9-4bea-b8d4-ad653daa91b8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"02b56b91-2ca9-4bea-b8d4-ad653daa91b8\") " pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.719529 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/02b56b91-2ca9-4bea-b8d4-ad653daa91b8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"02b56b91-2ca9-4bea-b8d4-ad653daa91b8\") " pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.726535 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02b56b91-2ca9-4bea-b8d4-ad653daa91b8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"02b56b91-2ca9-4bea-b8d4-ad653daa91b8\") " pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.733458 4687 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cz79j\" (UniqueName: \"kubernetes.io/projected/02b56b91-2ca9-4bea-b8d4-ad653daa91b8-kube-api-access-cz79j\") pod \"nova-cell1-novncproxy-0\" (UID: \"02b56b91-2ca9-4bea-b8d4-ad653daa91b8\") " pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.791741 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54619cb5-80eb-4995-99b7-fdd217f640f9-logs\") pod \"nova-metadata-0\" (UID: \"54619cb5-80eb-4995-99b7-fdd217f640f9\") " pod="openstack/nova-metadata-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.791969 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54619cb5-80eb-4995-99b7-fdd217f640f9-config-data\") pod \"nova-metadata-0\" (UID: \"54619cb5-80eb-4995-99b7-fdd217f640f9\") " pod="openstack/nova-metadata-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.792231 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54619cb5-80eb-4995-99b7-fdd217f640f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"54619cb5-80eb-4995-99b7-fdd217f640f9\") " pod="openstack/nova-metadata-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.792294 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdj4j\" (UniqueName: \"kubernetes.io/projected/54619cb5-80eb-4995-99b7-fdd217f640f9-kube-api-access-zdj4j\") pod \"nova-metadata-0\" (UID: \"54619cb5-80eb-4995-99b7-fdd217f640f9\") " pod="openstack/nova-metadata-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.792437 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/54619cb5-80eb-4995-99b7-fdd217f640f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"54619cb5-80eb-4995-99b7-fdd217f640f9\") " pod="openstack/nova-metadata-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.894373 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54619cb5-80eb-4995-99b7-fdd217f640f9-config-data\") pod \"nova-metadata-0\" (UID: \"54619cb5-80eb-4995-99b7-fdd217f640f9\") " pod="openstack/nova-metadata-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.894460 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54619cb5-80eb-4995-99b7-fdd217f640f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"54619cb5-80eb-4995-99b7-fdd217f640f9\") " pod="openstack/nova-metadata-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.894492 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdj4j\" (UniqueName: \"kubernetes.io/projected/54619cb5-80eb-4995-99b7-fdd217f640f9-kube-api-access-zdj4j\") pod \"nova-metadata-0\" (UID: \"54619cb5-80eb-4995-99b7-fdd217f640f9\") " pod="openstack/nova-metadata-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.894556 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/54619cb5-80eb-4995-99b7-fdd217f640f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"54619cb5-80eb-4995-99b7-fdd217f640f9\") " pod="openstack/nova-metadata-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.894584 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54619cb5-80eb-4995-99b7-fdd217f640f9-logs\") pod \"nova-metadata-0\" (UID: \"54619cb5-80eb-4995-99b7-fdd217f640f9\") " 
pod="openstack/nova-metadata-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.894983 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54619cb5-80eb-4995-99b7-fdd217f640f9-logs\") pod \"nova-metadata-0\" (UID: \"54619cb5-80eb-4995-99b7-fdd217f640f9\") " pod="openstack/nova-metadata-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.898009 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/54619cb5-80eb-4995-99b7-fdd217f640f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"54619cb5-80eb-4995-99b7-fdd217f640f9\") " pod="openstack/nova-metadata-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.898498 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54619cb5-80eb-4995-99b7-fdd217f640f9-config-data\") pod \"nova-metadata-0\" (UID: \"54619cb5-80eb-4995-99b7-fdd217f640f9\") " pod="openstack/nova-metadata-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.899336 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54619cb5-80eb-4995-99b7-fdd217f640f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"54619cb5-80eb-4995-99b7-fdd217f640f9\") " pod="openstack/nova-metadata-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.908357 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdj4j\" (UniqueName: \"kubernetes.io/projected/54619cb5-80eb-4995-99b7-fdd217f640f9-kube-api-access-zdj4j\") pod \"nova-metadata-0\" (UID: \"54619cb5-80eb-4995-99b7-fdd217f640f9\") " pod="openstack/nova-metadata-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.927598 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:21:58 crc kubenswrapper[4687]: I0228 09:21:58.993276 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 28 09:21:59 crc kubenswrapper[4687]: I0228 09:21:59.356196 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 28 09:21:59 crc kubenswrapper[4687]: W0228 09:21:59.406044 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02b56b91_2ca9_4bea_b8d4_ad653daa91b8.slice/crio-04a144384415ee4d339c88652d5c1522f67f584039e950520cc8217eb614b42e WatchSource:0}: Error finding container 04a144384415ee4d339c88652d5c1522f67f584039e950520cc8217eb614b42e: Status 404 returned error can't find the container with id 04a144384415ee4d339c88652d5c1522f67f584039e950520cc8217eb614b42e Feb 28 09:21:59 crc kubenswrapper[4687]: I0228 09:21:59.424437 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 28 09:21:59 crc kubenswrapper[4687]: W0228 09:21:59.425442 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54619cb5_80eb_4995_99b7_fdd217f640f9.slice/crio-4e007bf648202f001aae15e7f346d1c2d1be66134b27b41a056c25b6dc581578 WatchSource:0}: Error finding container 4e007bf648202f001aae15e7f346d1c2d1be66134b27b41a056c25b6dc581578: Status 404 returned error can't find the container with id 4e007bf648202f001aae15e7f346d1c2d1be66134b27b41a056c25b6dc581578 Feb 28 09:21:59 crc kubenswrapper[4687]: I0228 09:21:59.464388 4687 generic.go:334] "Generic (PLEG): container finished" podID="3cfad4a9-c499-491b-bc53-5346948e6e2a" containerID="799ff8485be9db4864a4e0a9ee0295a8bb1f7cfff3160ec45b8e4b43cb6ec244" exitCode=0 Feb 28 09:21:59 crc kubenswrapper[4687]: I0228 09:21:59.464486 4687 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-cell0-cell-mapping-trkpz" event={"ID":"3cfad4a9-c499-491b-bc53-5346948e6e2a","Type":"ContainerDied","Data":"799ff8485be9db4864a4e0a9ee0295a8bb1f7cfff3160ec45b8e4b43cb6ec244"} Feb 28 09:21:59 crc kubenswrapper[4687]: I0228 09:21:59.466282 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"02b56b91-2ca9-4bea-b8d4-ad653daa91b8","Type":"ContainerStarted","Data":"04a144384415ee4d339c88652d5c1522f67f584039e950520cc8217eb614b42e"} Feb 28 09:21:59 crc kubenswrapper[4687]: I0228 09:21:59.469052 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"54619cb5-80eb-4995-99b7-fdd217f640f9","Type":"ContainerStarted","Data":"4e007bf648202f001aae15e7f346d1c2d1be66134b27b41a056c25b6dc581578"} Feb 28 09:21:59 crc kubenswrapper[4687]: I0228 09:21:59.808902 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kz8x8" Feb 28 09:21:59 crc kubenswrapper[4687]: I0228 09:21:59.931095 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f368345f-9e9f-448e-af56-24950cc3b1f9-config-data\") pod \"f368345f-9e9f-448e-af56-24950cc3b1f9\" (UID: \"f368345f-9e9f-448e-af56-24950cc3b1f9\") " Feb 28 09:21:59 crc kubenswrapper[4687]: I0228 09:21:59.931549 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cn5pm\" (UniqueName: \"kubernetes.io/projected/f368345f-9e9f-448e-af56-24950cc3b1f9-kube-api-access-cn5pm\") pod \"f368345f-9e9f-448e-af56-24950cc3b1f9\" (UID: \"f368345f-9e9f-448e-af56-24950cc3b1f9\") " Feb 28 09:21:59 crc kubenswrapper[4687]: I0228 09:21:59.931595 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f368345f-9e9f-448e-af56-24950cc3b1f9-combined-ca-bundle\") pod 
\"f368345f-9e9f-448e-af56-24950cc3b1f9\" (UID: \"f368345f-9e9f-448e-af56-24950cc3b1f9\") " Feb 28 09:21:59 crc kubenswrapper[4687]: I0228 09:21:59.931619 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f368345f-9e9f-448e-af56-24950cc3b1f9-scripts\") pod \"f368345f-9e9f-448e-af56-24950cc3b1f9\" (UID: \"f368345f-9e9f-448e-af56-24950cc3b1f9\") " Feb 28 09:21:59 crc kubenswrapper[4687]: I0228 09:21:59.935941 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f368345f-9e9f-448e-af56-24950cc3b1f9-kube-api-access-cn5pm" (OuterVolumeSpecName: "kube-api-access-cn5pm") pod "f368345f-9e9f-448e-af56-24950cc3b1f9" (UID: "f368345f-9e9f-448e-af56-24950cc3b1f9"). InnerVolumeSpecName "kube-api-access-cn5pm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:21:59 crc kubenswrapper[4687]: I0228 09:21:59.938184 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f368345f-9e9f-448e-af56-24950cc3b1f9-scripts" (OuterVolumeSpecName: "scripts") pod "f368345f-9e9f-448e-af56-24950cc3b1f9" (UID: "f368345f-9e9f-448e-af56-24950cc3b1f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:59 crc kubenswrapper[4687]: I0228 09:21:59.956806 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f368345f-9e9f-448e-af56-24950cc3b1f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f368345f-9e9f-448e-af56-24950cc3b1f9" (UID: "f368345f-9e9f-448e-af56-24950cc3b1f9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:21:59 crc kubenswrapper[4687]: I0228 09:21:59.958330 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f368345f-9e9f-448e-af56-24950cc3b1f9-config-data" (OuterVolumeSpecName: "config-data") pod "f368345f-9e9f-448e-af56-24950cc3b1f9" (UID: "f368345f-9e9f-448e-af56-24950cc3b1f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.033872 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cn5pm\" (UniqueName: \"kubernetes.io/projected/f368345f-9e9f-448e-af56-24950cc3b1f9-kube-api-access-cn5pm\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.033909 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f368345f-9e9f-448e-af56-24950cc3b1f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.033920 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f368345f-9e9f-448e-af56-24950cc3b1f9-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.033929 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f368345f-9e9f-448e-af56-24950cc3b1f9-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.149031 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537842-cgjsw"] Feb 28 09:22:00 crc kubenswrapper[4687]: E0228 09:22:00.149505 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f368345f-9e9f-448e-af56-24950cc3b1f9" containerName="nova-cell1-conductor-db-sync" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.149523 
4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="f368345f-9e9f-448e-af56-24950cc3b1f9" containerName="nova-cell1-conductor-db-sync" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.149773 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="f368345f-9e9f-448e-af56-24950cc3b1f9" containerName="nova-cell1-conductor-db-sync" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.150462 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537842-cgjsw" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.152472 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.152523 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.152642 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.155517 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537842-cgjsw"] Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.339430 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62t6n\" (UniqueName: \"kubernetes.io/projected/70abdfed-0686-450a-b900-2eda9b68cec7-kube-api-access-62t6n\") pod \"auto-csr-approver-29537842-cgjsw\" (UID: \"70abdfed-0686-450a-b900-2eda9b68cec7\") " pod="openshift-infra/auto-csr-approver-29537842-cgjsw" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.441207 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62t6n\" (UniqueName: \"kubernetes.io/projected/70abdfed-0686-450a-b900-2eda9b68cec7-kube-api-access-62t6n\") pod 
\"auto-csr-approver-29537842-cgjsw\" (UID: \"70abdfed-0686-450a-b900-2eda9b68cec7\") " pod="openshift-infra/auto-csr-approver-29537842-cgjsw" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.456090 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62t6n\" (UniqueName: \"kubernetes.io/projected/70abdfed-0686-450a-b900-2eda9b68cec7-kube-api-access-62t6n\") pod \"auto-csr-approver-29537842-cgjsw\" (UID: \"70abdfed-0686-450a-b900-2eda9b68cec7\") " pod="openshift-infra/auto-csr-approver-29537842-cgjsw" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.464756 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537842-cgjsw" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.481715 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"02b56b91-2ca9-4bea-b8d4-ad653daa91b8","Type":"ContainerStarted","Data":"bcea6f13adacbefff1578d07da70ade6eecf36b8c464dd033ba3ea093a7021fe"} Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.484433 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"54619cb5-80eb-4995-99b7-fdd217f640f9","Type":"ContainerStarted","Data":"85b0f79588c173b4ba32f3ac552e441dde78210a5566f4fdcc4479c5aa22ed0d"} Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.484461 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"54619cb5-80eb-4995-99b7-fdd217f640f9","Type":"ContainerStarted","Data":"6b15094836e7b3d2648265271b5a760db1f6a40f1ce60aa76217779b29e79110"} Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.487540 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kz8x8" event={"ID":"f368345f-9e9f-448e-af56-24950cc3b1f9","Type":"ContainerDied","Data":"7fbf15fe5de4eb8403011be08d3daa9ad43c129a15ae242b0516952ae8d8c05b"} Feb 28 09:22:00 crc 
kubenswrapper[4687]: I0228 09:22:00.487559 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kz8x8" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.487757 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fbf15fe5de4eb8403011be08d3daa9ad43c129a15ae242b0516952ae8d8c05b" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.529259 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.529237116 podStartE2EDuration="2.529237116s" podCreationTimestamp="2026-02-28 09:21:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:22:00.500587413 +0000 UTC m=+1112.191156770" watchObservedRunningTime="2026-02-28 09:22:00.529237116 +0000 UTC m=+1112.219806452" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.540923 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.540899435 podStartE2EDuration="2.540899435s" podCreationTimestamp="2026-02-28 09:21:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:22:00.519763021 +0000 UTC m=+1112.210332368" watchObservedRunningTime="2026-02-28 09:22:00.540899435 +0000 UTC m=+1112.231468772" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.552643 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.553918 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.555782 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.558529 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.645708 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e45dcf0c-b04a-4ae5-9488-2051b3ea91df-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e45dcf0c-b04a-4ae5-9488-2051b3ea91df\") " pod="openstack/nova-cell1-conductor-0" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.646221 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq56d\" (UniqueName: \"kubernetes.io/projected/e45dcf0c-b04a-4ae5-9488-2051b3ea91df-kube-api-access-lq56d\") pod \"nova-cell1-conductor-0\" (UID: \"e45dcf0c-b04a-4ae5-9488-2051b3ea91df\") " pod="openstack/nova-cell1-conductor-0" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.646386 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e45dcf0c-b04a-4ae5-9488-2051b3ea91df-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e45dcf0c-b04a-4ae5-9488-2051b3ea91df\") " pod="openstack/nova-cell1-conductor-0" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.748466 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq56d\" (UniqueName: \"kubernetes.io/projected/e45dcf0c-b04a-4ae5-9488-2051b3ea91df-kube-api-access-lq56d\") pod \"nova-cell1-conductor-0\" (UID: \"e45dcf0c-b04a-4ae5-9488-2051b3ea91df\") " pod="openstack/nova-cell1-conductor-0" Feb 28 
09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.748560 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e45dcf0c-b04a-4ae5-9488-2051b3ea91df-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e45dcf0c-b04a-4ae5-9488-2051b3ea91df\") " pod="openstack/nova-cell1-conductor-0" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.748675 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e45dcf0c-b04a-4ae5-9488-2051b3ea91df-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e45dcf0c-b04a-4ae5-9488-2051b3ea91df\") " pod="openstack/nova-cell1-conductor-0" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.752740 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e45dcf0c-b04a-4ae5-9488-2051b3ea91df-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e45dcf0c-b04a-4ae5-9488-2051b3ea91df\") " pod="openstack/nova-cell1-conductor-0" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.761555 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e45dcf0c-b04a-4ae5-9488-2051b3ea91df-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e45dcf0c-b04a-4ae5-9488-2051b3ea91df\") " pod="openstack/nova-cell1-conductor-0" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.770911 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq56d\" (UniqueName: \"kubernetes.io/projected/e45dcf0c-b04a-4ae5-9488-2051b3ea91df-kube-api-access-lq56d\") pod \"nova-cell1-conductor-0\" (UID: \"e45dcf0c-b04a-4ae5-9488-2051b3ea91df\") " pod="openstack/nova-cell1-conductor-0" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.868310 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-trkpz" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.909594 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 28 09:22:00 crc kubenswrapper[4687]: I0228 09:22:00.973939 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537842-cgjsw"] Feb 28 09:22:00 crc kubenswrapper[4687]: W0228 09:22:00.984837 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70abdfed_0686_450a_b900_2eda9b68cec7.slice/crio-7207360c9276bcd57d8172654884778be47d9e32a5b1bafb90c3da4f725bf2ef WatchSource:0}: Error finding container 7207360c9276bcd57d8172654884778be47d9e32a5b1bafb90c3da4f725bf2ef: Status 404 returned error can't find the container with id 7207360c9276bcd57d8172654884778be47d9e32a5b1bafb90c3da4f725bf2ef Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.054458 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sclf5\" (UniqueName: \"kubernetes.io/projected/3cfad4a9-c499-491b-bc53-5346948e6e2a-kube-api-access-sclf5\") pod \"3cfad4a9-c499-491b-bc53-5346948e6e2a\" (UID: \"3cfad4a9-c499-491b-bc53-5346948e6e2a\") " Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.054573 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cfad4a9-c499-491b-bc53-5346948e6e2a-config-data\") pod \"3cfad4a9-c499-491b-bc53-5346948e6e2a\" (UID: \"3cfad4a9-c499-491b-bc53-5346948e6e2a\") " Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.054858 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cfad4a9-c499-491b-bc53-5346948e6e2a-scripts\") pod \"3cfad4a9-c499-491b-bc53-5346948e6e2a\" (UID: 
\"3cfad4a9-c499-491b-bc53-5346948e6e2a\") " Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.055202 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cfad4a9-c499-491b-bc53-5346948e6e2a-combined-ca-bundle\") pod \"3cfad4a9-c499-491b-bc53-5346948e6e2a\" (UID: \"3cfad4a9-c499-491b-bc53-5346948e6e2a\") " Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.060491 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cfad4a9-c499-491b-bc53-5346948e6e2a-scripts" (OuterVolumeSpecName: "scripts") pod "3cfad4a9-c499-491b-bc53-5346948e6e2a" (UID: "3cfad4a9-c499-491b-bc53-5346948e6e2a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.061660 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cfad4a9-c499-491b-bc53-5346948e6e2a-kube-api-access-sclf5" (OuterVolumeSpecName: "kube-api-access-sclf5") pod "3cfad4a9-c499-491b-bc53-5346948e6e2a" (UID: "3cfad4a9-c499-491b-bc53-5346948e6e2a"). InnerVolumeSpecName "kube-api-access-sclf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.079616 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cfad4a9-c499-491b-bc53-5346948e6e2a-config-data" (OuterVolumeSpecName: "config-data") pod "3cfad4a9-c499-491b-bc53-5346948e6e2a" (UID: "3cfad4a9-c499-491b-bc53-5346948e6e2a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.094330 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cfad4a9-c499-491b-bc53-5346948e6e2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3cfad4a9-c499-491b-bc53-5346948e6e2a" (UID: "3cfad4a9-c499-491b-bc53-5346948e6e2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.159126 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cfad4a9-c499-491b-bc53-5346948e6e2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.159321 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sclf5\" (UniqueName: \"kubernetes.io/projected/3cfad4a9-c499-491b-bc53-5346948e6e2a-kube-api-access-sclf5\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.159387 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cfad4a9-c499-491b-bc53-5346948e6e2a-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.159440 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cfad4a9-c499-491b-bc53-5346948e6e2a-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.301137 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 28 09:22:01 crc kubenswrapper[4687]: W0228 09:22:01.304716 4687 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode45dcf0c_b04a_4ae5_9488_2051b3ea91df.slice/crio-7365111cffef1ab394a1be3c70b6ee324c407680ee48f819ee383c0681423333 WatchSource:0}: Error finding container 7365111cffef1ab394a1be3c70b6ee324c407680ee48f819ee383c0681423333: Status 404 returned error can't find the container with id 7365111cffef1ab394a1be3c70b6ee324c407680ee48f819ee383c0681423333 Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.502504 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537842-cgjsw" event={"ID":"70abdfed-0686-450a-b900-2eda9b68cec7","Type":"ContainerStarted","Data":"7207360c9276bcd57d8172654884778be47d9e32a5b1bafb90c3da4f725bf2ef"} Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.504673 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-trkpz" event={"ID":"3cfad4a9-c499-491b-bc53-5346948e6e2a","Type":"ContainerDied","Data":"13f245a9ccbc4740a4d7346d199142e44ff4df15fff7518c2804211936aec985"} Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.504704 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-trkpz" Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.504717 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13f245a9ccbc4740a4d7346d199142e44ff4df15fff7518c2804211936aec985" Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.514633 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e45dcf0c-b04a-4ae5-9488-2051b3ea91df","Type":"ContainerStarted","Data":"6159e99750325a2fa7b77a22d90fa8c7fef7e5f698c2910b663c2b0a1b478169"} Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.514709 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e45dcf0c-b04a-4ae5-9488-2051b3ea91df","Type":"ContainerStarted","Data":"7365111cffef1ab394a1be3c70b6ee324c407680ee48f819ee383c0681423333"} Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.514780 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.534437 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.534413676 podStartE2EDuration="1.534413676s" podCreationTimestamp="2026-02-28 09:22:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:22:01.527333342 +0000 UTC m=+1113.217902679" watchObservedRunningTime="2026-02-28 09:22:01.534413676 +0000 UTC m=+1113.224983013" Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.651837 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.652198 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c9ce0b5b-0146-4809-9f5c-4e5547929f28" 
containerName="nova-api-log" containerID="cri-o://7a779fabce6b4450ddb7468f42bf81556d5d8f5d7ac01cf3c97a12a55a0c6c76" gracePeriod=30 Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.652336 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c9ce0b5b-0146-4809-9f5c-4e5547929f28" containerName="nova-api-api" containerID="cri-o://85e3f5794b8528f2326c40c80389157ae2d8c4b14f24b09d03bd66f246d8e3dc" gracePeriod=30 Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.667386 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.667607 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5129f869-98af-4a0f-ae3a-c3ac815078bc" containerName="nova-scheduler-scheduler" containerID="cri-o://cbe6bcb4183df4aff06fa3e028510922ed8a7f4b6b2caf136db98f99b91ccd73" gracePeriod=30 Feb 28 09:22:01 crc kubenswrapper[4687]: I0228 09:22:01.701453 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.246122 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.393225 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9ce0b5b-0146-4809-9f5c-4e5547929f28-config-data\") pod \"c9ce0b5b-0146-4809-9f5c-4e5547929f28\" (UID: \"c9ce0b5b-0146-4809-9f5c-4e5547929f28\") " Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.393296 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc6w9\" (UniqueName: \"kubernetes.io/projected/c9ce0b5b-0146-4809-9f5c-4e5547929f28-kube-api-access-xc6w9\") pod \"c9ce0b5b-0146-4809-9f5c-4e5547929f28\" (UID: \"c9ce0b5b-0146-4809-9f5c-4e5547929f28\") " Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.393385 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9ce0b5b-0146-4809-9f5c-4e5547929f28-logs\") pod \"c9ce0b5b-0146-4809-9f5c-4e5547929f28\" (UID: \"c9ce0b5b-0146-4809-9f5c-4e5547929f28\") " Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.393562 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9ce0b5b-0146-4809-9f5c-4e5547929f28-combined-ca-bundle\") pod \"c9ce0b5b-0146-4809-9f5c-4e5547929f28\" (UID: \"c9ce0b5b-0146-4809-9f5c-4e5547929f28\") " Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.393996 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9ce0b5b-0146-4809-9f5c-4e5547929f28-logs" (OuterVolumeSpecName: "logs") pod "c9ce0b5b-0146-4809-9f5c-4e5547929f28" (UID: "c9ce0b5b-0146-4809-9f5c-4e5547929f28"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.399098 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9ce0b5b-0146-4809-9f5c-4e5547929f28-kube-api-access-xc6w9" (OuterVolumeSpecName: "kube-api-access-xc6w9") pod "c9ce0b5b-0146-4809-9f5c-4e5547929f28" (UID: "c9ce0b5b-0146-4809-9f5c-4e5547929f28"). InnerVolumeSpecName "kube-api-access-xc6w9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.417814 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9ce0b5b-0146-4809-9f5c-4e5547929f28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9ce0b5b-0146-4809-9f5c-4e5547929f28" (UID: "c9ce0b5b-0146-4809-9f5c-4e5547929f28"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.420464 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9ce0b5b-0146-4809-9f5c-4e5547929f28-config-data" (OuterVolumeSpecName: "config-data") pod "c9ce0b5b-0146-4809-9f5c-4e5547929f28" (UID: "c9ce0b5b-0146-4809-9f5c-4e5547929f28"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.496311 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9ce0b5b-0146-4809-9f5c-4e5547929f28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.496572 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9ce0b5b-0146-4809-9f5c-4e5547929f28-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.496582 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc6w9\" (UniqueName: \"kubernetes.io/projected/c9ce0b5b-0146-4809-9f5c-4e5547929f28-kube-api-access-xc6w9\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.496595 4687 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9ce0b5b-0146-4809-9f5c-4e5547929f28-logs\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.524518 4687 generic.go:334] "Generic (PLEG): container finished" podID="5129f869-98af-4a0f-ae3a-c3ac815078bc" containerID="cbe6bcb4183df4aff06fa3e028510922ed8a7f4b6b2caf136db98f99b91ccd73" exitCode=0 Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.524592 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5129f869-98af-4a0f-ae3a-c3ac815078bc","Type":"ContainerDied","Data":"cbe6bcb4183df4aff06fa3e028510922ed8a7f4b6b2caf136db98f99b91ccd73"} Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.526840 4687 generic.go:334] "Generic (PLEG): container finished" podID="c9ce0b5b-0146-4809-9f5c-4e5547929f28" containerID="85e3f5794b8528f2326c40c80389157ae2d8c4b14f24b09d03bd66f246d8e3dc" exitCode=0 Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.526861 4687 
generic.go:334] "Generic (PLEG): container finished" podID="c9ce0b5b-0146-4809-9f5c-4e5547929f28" containerID="7a779fabce6b4450ddb7468f42bf81556d5d8f5d7ac01cf3c97a12a55a0c6c76" exitCode=143 Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.526891 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9ce0b5b-0146-4809-9f5c-4e5547929f28","Type":"ContainerDied","Data":"85e3f5794b8528f2326c40c80389157ae2d8c4b14f24b09d03bd66f246d8e3dc"} Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.526907 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9ce0b5b-0146-4809-9f5c-4e5547929f28","Type":"ContainerDied","Data":"7a779fabce6b4450ddb7468f42bf81556d5d8f5d7ac01cf3c97a12a55a0c6c76"} Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.526918 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9ce0b5b-0146-4809-9f5c-4e5547929f28","Type":"ContainerDied","Data":"40ab86a451d0a40dad199d7f5907caee0533cce208161145e08148ebe0ba2132"} Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.526937 4687 scope.go:117] "RemoveContainer" containerID="85e3f5794b8528f2326c40c80389157ae2d8c4b14f24b09d03bd66f246d8e3dc" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.527085 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.535082 4687 generic.go:334] "Generic (PLEG): container finished" podID="70abdfed-0686-450a-b900-2eda9b68cec7" containerID="37a55319d850b5712035136020c7276544e69623d55169e9d4ad009f9f951568" exitCode=0 Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.535243 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537842-cgjsw" event={"ID":"70abdfed-0686-450a-b900-2eda9b68cec7","Type":"ContainerDied","Data":"37a55319d850b5712035136020c7276544e69623d55169e9d4ad009f9f951568"} Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.535653 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="54619cb5-80eb-4995-99b7-fdd217f640f9" containerName="nova-metadata-log" containerID="cri-o://6b15094836e7b3d2648265271b5a760db1f6a40f1ce60aa76217779b29e79110" gracePeriod=30 Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.535831 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="54619cb5-80eb-4995-99b7-fdd217f640f9" containerName="nova-metadata-metadata" containerID="cri-o://85b0f79588c173b4ba32f3ac552e441dde78210a5566f4fdcc4479c5aa22ed0d" gracePeriod=30 Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.624256 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.632600 4687 scope.go:117] "RemoveContainer" containerID="7a779fabce6b4450ddb7468f42bf81556d5d8f5d7ac01cf3c97a12a55a0c6c76" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.640469 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.653966 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.677150 4687 scope.go:117] "RemoveContainer" containerID="85e3f5794b8528f2326c40c80389157ae2d8c4b14f24b09d03bd66f246d8e3dc" Feb 28 09:22:02 crc kubenswrapper[4687]: E0228 09:22:02.677468 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85e3f5794b8528f2326c40c80389157ae2d8c4b14f24b09d03bd66f246d8e3dc\": container with ID starting with 85e3f5794b8528f2326c40c80389157ae2d8c4b14f24b09d03bd66f246d8e3dc not found: ID does not exist" containerID="85e3f5794b8528f2326c40c80389157ae2d8c4b14f24b09d03bd66f246d8e3dc" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.677505 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85e3f5794b8528f2326c40c80389157ae2d8c4b14f24b09d03bd66f246d8e3dc"} err="failed to get container status \"85e3f5794b8528f2326c40c80389157ae2d8c4b14f24b09d03bd66f246d8e3dc\": rpc error: code = NotFound desc = could not find container \"85e3f5794b8528f2326c40c80389157ae2d8c4b14f24b09d03bd66f246d8e3dc\": container with ID starting with 85e3f5794b8528f2326c40c80389157ae2d8c4b14f24b09d03bd66f246d8e3dc not found: ID does not exist" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.677527 4687 scope.go:117] "RemoveContainer" containerID="7a779fabce6b4450ddb7468f42bf81556d5d8f5d7ac01cf3c97a12a55a0c6c76" Feb 28 09:22:02 crc kubenswrapper[4687]: E0228 
09:22:02.677786 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a779fabce6b4450ddb7468f42bf81556d5d8f5d7ac01cf3c97a12a55a0c6c76\": container with ID starting with 7a779fabce6b4450ddb7468f42bf81556d5d8f5d7ac01cf3c97a12a55a0c6c76 not found: ID does not exist" containerID="7a779fabce6b4450ddb7468f42bf81556d5d8f5d7ac01cf3c97a12a55a0c6c76" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.677847 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a779fabce6b4450ddb7468f42bf81556d5d8f5d7ac01cf3c97a12a55a0c6c76"} err="failed to get container status \"7a779fabce6b4450ddb7468f42bf81556d5d8f5d7ac01cf3c97a12a55a0c6c76\": rpc error: code = NotFound desc = could not find container \"7a779fabce6b4450ddb7468f42bf81556d5d8f5d7ac01cf3c97a12a55a0c6c76\": container with ID starting with 7a779fabce6b4450ddb7468f42bf81556d5d8f5d7ac01cf3c97a12a55a0c6c76 not found: ID does not exist" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.677876 4687 scope.go:117] "RemoveContainer" containerID="85e3f5794b8528f2326c40c80389157ae2d8c4b14f24b09d03bd66f246d8e3dc" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.681350 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85e3f5794b8528f2326c40c80389157ae2d8c4b14f24b09d03bd66f246d8e3dc"} err="failed to get container status \"85e3f5794b8528f2326c40c80389157ae2d8c4b14f24b09d03bd66f246d8e3dc\": rpc error: code = NotFound desc = could not find container \"85e3f5794b8528f2326c40c80389157ae2d8c4b14f24b09d03bd66f246d8e3dc\": container with ID starting with 85e3f5794b8528f2326c40c80389157ae2d8c4b14f24b09d03bd66f246d8e3dc not found: ID does not exist" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.681391 4687 scope.go:117] "RemoveContainer" containerID="7a779fabce6b4450ddb7468f42bf81556d5d8f5d7ac01cf3c97a12a55a0c6c76" Feb 28 09:22:02 crc 
kubenswrapper[4687]: I0228 09:22:02.681718 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a779fabce6b4450ddb7468f42bf81556d5d8f5d7ac01cf3c97a12a55a0c6c76"} err="failed to get container status \"7a779fabce6b4450ddb7468f42bf81556d5d8f5d7ac01cf3c97a12a55a0c6c76\": rpc error: code = NotFound desc = could not find container \"7a779fabce6b4450ddb7468f42bf81556d5d8f5d7ac01cf3c97a12a55a0c6c76\": container with ID starting with 7a779fabce6b4450ddb7468f42bf81556d5d8f5d7ac01cf3c97a12a55a0c6c76 not found: ID does not exist" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.683766 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9ce0b5b-0146-4809-9f5c-4e5547929f28" path="/var/lib/kubelet/pods/c9ce0b5b-0146-4809-9f5c-4e5547929f28/volumes" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.686521 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 28 09:22:02 crc kubenswrapper[4687]: E0228 09:22:02.686974 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5129f869-98af-4a0f-ae3a-c3ac815078bc" containerName="nova-scheduler-scheduler" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.687107 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="5129f869-98af-4a0f-ae3a-c3ac815078bc" containerName="nova-scheduler-scheduler" Feb 28 09:22:02 crc kubenswrapper[4687]: E0228 09:22:02.687168 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ce0b5b-0146-4809-9f5c-4e5547929f28" containerName="nova-api-api" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.687227 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ce0b5b-0146-4809-9f5c-4e5547929f28" containerName="nova-api-api" Feb 28 09:22:02 crc kubenswrapper[4687]: E0228 09:22:02.687295 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ce0b5b-0146-4809-9f5c-4e5547929f28" containerName="nova-api-log" Feb 28 09:22:02 crc 
kubenswrapper[4687]: I0228 09:22:02.687765 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ce0b5b-0146-4809-9f5c-4e5547929f28" containerName="nova-api-log" Feb 28 09:22:02 crc kubenswrapper[4687]: E0228 09:22:02.687841 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cfad4a9-c499-491b-bc53-5346948e6e2a" containerName="nova-manage" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.687888 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cfad4a9-c499-491b-bc53-5346948e6e2a" containerName="nova-manage" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.688211 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="5129f869-98af-4a0f-ae3a-c3ac815078bc" containerName="nova-scheduler-scheduler" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.688306 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ce0b5b-0146-4809-9f5c-4e5547929f28" containerName="nova-api-log" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.688358 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ce0b5b-0146-4809-9f5c-4e5547929f28" containerName="nova-api-api" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.688420 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cfad4a9-c499-491b-bc53-5346948e6e2a" containerName="nova-manage" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.690941 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.693605 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.702097 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/001b7c85-0b9f-4fdb-83b7-687c36587331-config-data\") pod \"nova-api-0\" (UID: \"001b7c85-0b9f-4fdb-83b7-687c36587331\") " pod="openstack/nova-api-0" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.702195 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/001b7c85-0b9f-4fdb-83b7-687c36587331-logs\") pod \"nova-api-0\" (UID: \"001b7c85-0b9f-4fdb-83b7-687c36587331\") " pod="openstack/nova-api-0" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.702226 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54hqp\" (UniqueName: \"kubernetes.io/projected/001b7c85-0b9f-4fdb-83b7-687c36587331-kube-api-access-54hqp\") pod \"nova-api-0\" (UID: \"001b7c85-0b9f-4fdb-83b7-687c36587331\") " pod="openstack/nova-api-0" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.702616 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/001b7c85-0b9f-4fdb-83b7-687c36587331-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"001b7c85-0b9f-4fdb-83b7-687c36587331\") " pod="openstack/nova-api-0" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.703305 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.809193 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-vfwdw\" (UniqueName: \"kubernetes.io/projected/5129f869-98af-4a0f-ae3a-c3ac815078bc-kube-api-access-vfwdw\") pod \"5129f869-98af-4a0f-ae3a-c3ac815078bc\" (UID: \"5129f869-98af-4a0f-ae3a-c3ac815078bc\") " Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.809492 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5129f869-98af-4a0f-ae3a-c3ac815078bc-config-data\") pod \"5129f869-98af-4a0f-ae3a-c3ac815078bc\" (UID: \"5129f869-98af-4a0f-ae3a-c3ac815078bc\") " Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.809605 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5129f869-98af-4a0f-ae3a-c3ac815078bc-combined-ca-bundle\") pod \"5129f869-98af-4a0f-ae3a-c3ac815078bc\" (UID: \"5129f869-98af-4a0f-ae3a-c3ac815078bc\") " Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.809974 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/001b7c85-0b9f-4fdb-83b7-687c36587331-config-data\") pod \"nova-api-0\" (UID: \"001b7c85-0b9f-4fdb-83b7-687c36587331\") " pod="openstack/nova-api-0" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.810030 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/001b7c85-0b9f-4fdb-83b7-687c36587331-logs\") pod \"nova-api-0\" (UID: \"001b7c85-0b9f-4fdb-83b7-687c36587331\") " pod="openstack/nova-api-0" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.810055 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54hqp\" (UniqueName: \"kubernetes.io/projected/001b7c85-0b9f-4fdb-83b7-687c36587331-kube-api-access-54hqp\") pod \"nova-api-0\" (UID: \"001b7c85-0b9f-4fdb-83b7-687c36587331\") " pod="openstack/nova-api-0" Feb 28 09:22:02 crc 
kubenswrapper[4687]: I0228 09:22:02.810552 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/001b7c85-0b9f-4fdb-83b7-687c36587331-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"001b7c85-0b9f-4fdb-83b7-687c36587331\") " pod="openstack/nova-api-0" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.811046 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/001b7c85-0b9f-4fdb-83b7-687c36587331-logs\") pod \"nova-api-0\" (UID: \"001b7c85-0b9f-4fdb-83b7-687c36587331\") " pod="openstack/nova-api-0" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.815512 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5129f869-98af-4a0f-ae3a-c3ac815078bc-kube-api-access-vfwdw" (OuterVolumeSpecName: "kube-api-access-vfwdw") pod "5129f869-98af-4a0f-ae3a-c3ac815078bc" (UID: "5129f869-98af-4a0f-ae3a-c3ac815078bc"). InnerVolumeSpecName "kube-api-access-vfwdw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.816792 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/001b7c85-0b9f-4fdb-83b7-687c36587331-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"001b7c85-0b9f-4fdb-83b7-687c36587331\") " pod="openstack/nova-api-0" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.832283 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/001b7c85-0b9f-4fdb-83b7-687c36587331-config-data\") pod \"nova-api-0\" (UID: \"001b7c85-0b9f-4fdb-83b7-687c36587331\") " pod="openstack/nova-api-0" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.835535 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54hqp\" (UniqueName: \"kubernetes.io/projected/001b7c85-0b9f-4fdb-83b7-687c36587331-kube-api-access-54hqp\") pod \"nova-api-0\" (UID: \"001b7c85-0b9f-4fdb-83b7-687c36587331\") " pod="openstack/nova-api-0" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.864680 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5129f869-98af-4a0f-ae3a-c3ac815078bc-config-data" (OuterVolumeSpecName: "config-data") pod "5129f869-98af-4a0f-ae3a-c3ac815078bc" (UID: "5129f869-98af-4a0f-ae3a-c3ac815078bc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.865640 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5129f869-98af-4a0f-ae3a-c3ac815078bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5129f869-98af-4a0f-ae3a-c3ac815078bc" (UID: "5129f869-98af-4a0f-ae3a-c3ac815078bc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.878232 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.914083 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5129f869-98af-4a0f-ae3a-c3ac815078bc-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.914120 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5129f869-98af-4a0f-ae3a-c3ac815078bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.914133 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfwdw\" (UniqueName: \"kubernetes.io/projected/5129f869-98af-4a0f-ae3a-c3ac815078bc-kube-api-access-vfwdw\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.935064 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-5h7z4"] Feb 28 09:22:02 crc kubenswrapper[4687]: I0228 09:22:02.935297 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" podUID="1af91582-eba0-43db-8e20-00caea60a31a" containerName="dnsmasq-dns" containerID="cri-o://6fdce1c4712b81e3a7f2c0f2f5d350742b7afa3b137795c3f6466d725d830c99" gracePeriod=10 Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.030524 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.102299 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.116535 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54619cb5-80eb-4995-99b7-fdd217f640f9-config-data\") pod \"54619cb5-80eb-4995-99b7-fdd217f640f9\" (UID: \"54619cb5-80eb-4995-99b7-fdd217f640f9\") " Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.117101 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdj4j\" (UniqueName: \"kubernetes.io/projected/54619cb5-80eb-4995-99b7-fdd217f640f9-kube-api-access-zdj4j\") pod \"54619cb5-80eb-4995-99b7-fdd217f640f9\" (UID: \"54619cb5-80eb-4995-99b7-fdd217f640f9\") " Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.117234 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54619cb5-80eb-4995-99b7-fdd217f640f9-combined-ca-bundle\") pod \"54619cb5-80eb-4995-99b7-fdd217f640f9\" (UID: \"54619cb5-80eb-4995-99b7-fdd217f640f9\") " Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.117329 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54619cb5-80eb-4995-99b7-fdd217f640f9-logs\") pod \"54619cb5-80eb-4995-99b7-fdd217f640f9\" (UID: \"54619cb5-80eb-4995-99b7-fdd217f640f9\") " Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.117405 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/54619cb5-80eb-4995-99b7-fdd217f640f9-nova-metadata-tls-certs\") pod \"54619cb5-80eb-4995-99b7-fdd217f640f9\" (UID: \"54619cb5-80eb-4995-99b7-fdd217f640f9\") " Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.118598 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/54619cb5-80eb-4995-99b7-fdd217f640f9-logs" (OuterVolumeSpecName: "logs") pod "54619cb5-80eb-4995-99b7-fdd217f640f9" (UID: "54619cb5-80eb-4995-99b7-fdd217f640f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.125638 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54619cb5-80eb-4995-99b7-fdd217f640f9-kube-api-access-zdj4j" (OuterVolumeSpecName: "kube-api-access-zdj4j") pod "54619cb5-80eb-4995-99b7-fdd217f640f9" (UID: "54619cb5-80eb-4995-99b7-fdd217f640f9"). InnerVolumeSpecName "kube-api-access-zdj4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.168594 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54619cb5-80eb-4995-99b7-fdd217f640f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54619cb5-80eb-4995-99b7-fdd217f640f9" (UID: "54619cb5-80eb-4995-99b7-fdd217f640f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.186489 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54619cb5-80eb-4995-99b7-fdd217f640f9-config-data" (OuterVolumeSpecName: "config-data") pod "54619cb5-80eb-4995-99b7-fdd217f640f9" (UID: "54619cb5-80eb-4995-99b7-fdd217f640f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.189625 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54619cb5-80eb-4995-99b7-fdd217f640f9-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "54619cb5-80eb-4995-99b7-fdd217f640f9" (UID: "54619cb5-80eb-4995-99b7-fdd217f640f9"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.220675 4687 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54619cb5-80eb-4995-99b7-fdd217f640f9-logs\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.220710 4687 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/54619cb5-80eb-4995-99b7-fdd217f640f9-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.220723 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54619cb5-80eb-4995-99b7-fdd217f640f9-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.220732 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdj4j\" (UniqueName: \"kubernetes.io/projected/54619cb5-80eb-4995-99b7-fdd217f640f9-kube-api-access-zdj4j\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.220742 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54619cb5-80eb-4995-99b7-fdd217f640f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.308571 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.322631 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-dns-svc\") pod \"1af91582-eba0-43db-8e20-00caea60a31a\" (UID: \"1af91582-eba0-43db-8e20-00caea60a31a\") " Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.322683 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-ovsdbserver-nb\") pod \"1af91582-eba0-43db-8e20-00caea60a31a\" (UID: \"1af91582-eba0-43db-8e20-00caea60a31a\") " Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.322726 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-dns-swift-storage-0\") pod \"1af91582-eba0-43db-8e20-00caea60a31a\" (UID: \"1af91582-eba0-43db-8e20-00caea60a31a\") " Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.322786 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-config\") pod \"1af91582-eba0-43db-8e20-00caea60a31a\" (UID: \"1af91582-eba0-43db-8e20-00caea60a31a\") " Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.322843 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-ovsdbserver-sb\") pod \"1af91582-eba0-43db-8e20-00caea60a31a\" (UID: \"1af91582-eba0-43db-8e20-00caea60a31a\") " Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.322911 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh7xs\" 
(UniqueName: \"kubernetes.io/projected/1af91582-eba0-43db-8e20-00caea60a31a-kube-api-access-zh7xs\") pod \"1af91582-eba0-43db-8e20-00caea60a31a\" (UID: \"1af91582-eba0-43db-8e20-00caea60a31a\") " Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.329683 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1af91582-eba0-43db-8e20-00caea60a31a-kube-api-access-zh7xs" (OuterVolumeSpecName: "kube-api-access-zh7xs") pod "1af91582-eba0-43db-8e20-00caea60a31a" (UID: "1af91582-eba0-43db-8e20-00caea60a31a"). InnerVolumeSpecName "kube-api-access-zh7xs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.358688 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-config" (OuterVolumeSpecName: "config") pod "1af91582-eba0-43db-8e20-00caea60a31a" (UID: "1af91582-eba0-43db-8e20-00caea60a31a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.360889 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1af91582-eba0-43db-8e20-00caea60a31a" (UID: "1af91582-eba0-43db-8e20-00caea60a31a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.363174 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1af91582-eba0-43db-8e20-00caea60a31a" (UID: "1af91582-eba0-43db-8e20-00caea60a31a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.363687 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1af91582-eba0-43db-8e20-00caea60a31a" (UID: "1af91582-eba0-43db-8e20-00caea60a31a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.365905 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1af91582-eba0-43db-8e20-00caea60a31a" (UID: "1af91582-eba0-43db-8e20-00caea60a31a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.424959 4687 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.424987 4687 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.424998 4687 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.425007 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-config\") on node \"crc\" 
DevicePath \"\"" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.425015 4687 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1af91582-eba0-43db-8e20-00caea60a31a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.425041 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh7xs\" (UniqueName: \"kubernetes.io/projected/1af91582-eba0-43db-8e20-00caea60a31a-kube-api-access-zh7xs\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.545161 4687 generic.go:334] "Generic (PLEG): container finished" podID="1af91582-eba0-43db-8e20-00caea60a31a" containerID="6fdce1c4712b81e3a7f2c0f2f5d350742b7afa3b137795c3f6466d725d830c99" exitCode=0 Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.545256 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" event={"ID":"1af91582-eba0-43db-8e20-00caea60a31a","Type":"ContainerDied","Data":"6fdce1c4712b81e3a7f2c0f2f5d350742b7afa3b137795c3f6466d725d830c99"} Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.545281 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.545321 4687 scope.go:117] "RemoveContainer" containerID="6fdce1c4712b81e3a7f2c0f2f5d350742b7afa3b137795c3f6466d725d830c99" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.545290 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-5h7z4" event={"ID":"1af91582-eba0-43db-8e20-00caea60a31a","Type":"ContainerDied","Data":"28cc9f792b72eebf39235a0c18c3bd2f077465c537ca68227c3a35ceea3b9b29"} Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.549474 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5129f869-98af-4a0f-ae3a-c3ac815078bc","Type":"ContainerDied","Data":"36af6469ceb40ac6db878ea8585cd9fe2e75e95da8c3c4f0e9e8fc191e6da661"} Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.549561 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.559856 4687 generic.go:334] "Generic (PLEG): container finished" podID="54619cb5-80eb-4995-99b7-fdd217f640f9" containerID="85b0f79588c173b4ba32f3ac552e441dde78210a5566f4fdcc4479c5aa22ed0d" exitCode=0 Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.559890 4687 generic.go:334] "Generic (PLEG): container finished" podID="54619cb5-80eb-4995-99b7-fdd217f640f9" containerID="6b15094836e7b3d2648265271b5a760db1f6a40f1ce60aa76217779b29e79110" exitCode=143 Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.560112 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.561180 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"54619cb5-80eb-4995-99b7-fdd217f640f9","Type":"ContainerDied","Data":"85b0f79588c173b4ba32f3ac552e441dde78210a5566f4fdcc4479c5aa22ed0d"} Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.561240 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"54619cb5-80eb-4995-99b7-fdd217f640f9","Type":"ContainerDied","Data":"6b15094836e7b3d2648265271b5a760db1f6a40f1ce60aa76217779b29e79110"} Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.561252 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"54619cb5-80eb-4995-99b7-fdd217f640f9","Type":"ContainerDied","Data":"4e007bf648202f001aae15e7f346d1c2d1be66134b27b41a056c25b6dc581578"} Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.591477 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 28 09:22:03 crc kubenswrapper[4687]: W0228 09:22:03.594256 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod001b7c85_0b9f_4fdb_83b7_687c36587331.slice/crio-c389697740556c3634fb7f7a53ee374b827abbe5962839c5cf201bd650f0424a WatchSource:0}: Error finding container c389697740556c3634fb7f7a53ee374b827abbe5962839c5cf201bd650f0424a: Status 404 returned error can't find the container with id c389697740556c3634fb7f7a53ee374b827abbe5962839c5cf201bd650f0424a Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.693687 4687 scope.go:117] "RemoveContainer" containerID="1059573a418168df4350af102eafdba6bdc26f0c68f02d94fe7749c4a80a11ec" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.741320 4687 scope.go:117] "RemoveContainer" containerID="6fdce1c4712b81e3a7f2c0f2f5d350742b7afa3b137795c3f6466d725d830c99" 
Feb 28 09:22:03 crc kubenswrapper[4687]: E0228 09:22:03.742080 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fdce1c4712b81e3a7f2c0f2f5d350742b7afa3b137795c3f6466d725d830c99\": container with ID starting with 6fdce1c4712b81e3a7f2c0f2f5d350742b7afa3b137795c3f6466d725d830c99 not found: ID does not exist" containerID="6fdce1c4712b81e3a7f2c0f2f5d350742b7afa3b137795c3f6466d725d830c99" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.742143 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fdce1c4712b81e3a7f2c0f2f5d350742b7afa3b137795c3f6466d725d830c99"} err="failed to get container status \"6fdce1c4712b81e3a7f2c0f2f5d350742b7afa3b137795c3f6466d725d830c99\": rpc error: code = NotFound desc = could not find container \"6fdce1c4712b81e3a7f2c0f2f5d350742b7afa3b137795c3f6466d725d830c99\": container with ID starting with 6fdce1c4712b81e3a7f2c0f2f5d350742b7afa3b137795c3f6466d725d830c99 not found: ID does not exist" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.742189 4687 scope.go:117] "RemoveContainer" containerID="1059573a418168df4350af102eafdba6bdc26f0c68f02d94fe7749c4a80a11ec" Feb 28 09:22:03 crc kubenswrapper[4687]: E0228 09:22:03.763319 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1059573a418168df4350af102eafdba6bdc26f0c68f02d94fe7749c4a80a11ec\": container with ID starting with 1059573a418168df4350af102eafdba6bdc26f0c68f02d94fe7749c4a80a11ec not found: ID does not exist" containerID="1059573a418168df4350af102eafdba6bdc26f0c68f02d94fe7749c4a80a11ec" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.763606 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1059573a418168df4350af102eafdba6bdc26f0c68f02d94fe7749c4a80a11ec"} err="failed to get container status 
\"1059573a418168df4350af102eafdba6bdc26f0c68f02d94fe7749c4a80a11ec\": rpc error: code = NotFound desc = could not find container \"1059573a418168df4350af102eafdba6bdc26f0c68f02d94fe7749c4a80a11ec\": container with ID starting with 1059573a418168df4350af102eafdba6bdc26f0c68f02d94fe7749c4a80a11ec not found: ID does not exist" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.763633 4687 scope.go:117] "RemoveContainer" containerID="cbe6bcb4183df4aff06fa3e028510922ed8a7f4b6b2caf136db98f99b91ccd73" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.776905 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-5h7z4"] Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.792983 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-5h7z4"] Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.802222 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.818047 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.818718 4687 scope.go:117] "RemoveContainer" containerID="85b0f79588c173b4ba32f3ac552e441dde78210a5566f4fdcc4479c5aa22ed0d" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.828138 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 28 09:22:03 crc kubenswrapper[4687]: E0228 09:22:03.828654 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54619cb5-80eb-4995-99b7-fdd217f640f9" containerName="nova-metadata-metadata" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.828672 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="54619cb5-80eb-4995-99b7-fdd217f640f9" containerName="nova-metadata-metadata" Feb 28 09:22:03 crc kubenswrapper[4687]: E0228 09:22:03.828700 4687 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="1af91582-eba0-43db-8e20-00caea60a31a" containerName="init" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.828710 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="1af91582-eba0-43db-8e20-00caea60a31a" containerName="init" Feb 28 09:22:03 crc kubenswrapper[4687]: E0228 09:22:03.828726 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54619cb5-80eb-4995-99b7-fdd217f640f9" containerName="nova-metadata-log" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.828735 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="54619cb5-80eb-4995-99b7-fdd217f640f9" containerName="nova-metadata-log" Feb 28 09:22:03 crc kubenswrapper[4687]: E0228 09:22:03.828749 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1af91582-eba0-43db-8e20-00caea60a31a" containerName="dnsmasq-dns" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.828756 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="1af91582-eba0-43db-8e20-00caea60a31a" containerName="dnsmasq-dns" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.828991 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="54619cb5-80eb-4995-99b7-fdd217f640f9" containerName="nova-metadata-metadata" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.829015 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="54619cb5-80eb-4995-99b7-fdd217f640f9" containerName="nova-metadata-log" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.829038 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="1af91582-eba0-43db-8e20-00caea60a31a" containerName="dnsmasq-dns" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.829748 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.834082 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.834996 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.861091 4687 scope.go:117] "RemoveContainer" containerID="6b15094836e7b3d2648265271b5a760db1f6a40f1ce60aa76217779b29e79110" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.861337 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srfdd\" (UniqueName: \"kubernetes.io/projected/542410c5-adcf-424e-966c-9c919abe28fc-kube-api-access-srfdd\") pod \"nova-scheduler-0\" (UID: \"542410c5-adcf-424e-966c-9c919abe28fc\") " pod="openstack/nova-scheduler-0" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.861464 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/542410c5-adcf-424e-966c-9c919abe28fc-config-data\") pod \"nova-scheduler-0\" (UID: \"542410c5-adcf-424e-966c-9c919abe28fc\") " pod="openstack/nova-scheduler-0" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.861687 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/542410c5-adcf-424e-966c-9c919abe28fc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"542410c5-adcf-424e-966c-9c919abe28fc\") " pod="openstack/nova-scheduler-0" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.863692 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.871372 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-scheduler-0"] Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.884557 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.885606 4687 scope.go:117] "RemoveContainer" containerID="85b0f79588c173b4ba32f3ac552e441dde78210a5566f4fdcc4479c5aa22ed0d" Feb 28 09:22:03 crc kubenswrapper[4687]: E0228 09:22:03.886001 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85b0f79588c173b4ba32f3ac552e441dde78210a5566f4fdcc4479c5aa22ed0d\": container with ID starting with 85b0f79588c173b4ba32f3ac552e441dde78210a5566f4fdcc4479c5aa22ed0d not found: ID does not exist" containerID="85b0f79588c173b4ba32f3ac552e441dde78210a5566f4fdcc4479c5aa22ed0d" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.886056 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85b0f79588c173b4ba32f3ac552e441dde78210a5566f4fdcc4479c5aa22ed0d"} err="failed to get container status \"85b0f79588c173b4ba32f3ac552e441dde78210a5566f4fdcc4479c5aa22ed0d\": rpc error: code = NotFound desc = could not find container \"85b0f79588c173b4ba32f3ac552e441dde78210a5566f4fdcc4479c5aa22ed0d\": container with ID starting with 85b0f79588c173b4ba32f3ac552e441dde78210a5566f4fdcc4479c5aa22ed0d not found: ID does not exist" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.886089 4687 scope.go:117] "RemoveContainer" containerID="6b15094836e7b3d2648265271b5a760db1f6a40f1ce60aa76217779b29e79110" Feb 28 09:22:03 crc kubenswrapper[4687]: E0228 09:22:03.886576 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b15094836e7b3d2648265271b5a760db1f6a40f1ce60aa76217779b29e79110\": container with ID starting with 6b15094836e7b3d2648265271b5a760db1f6a40f1ce60aa76217779b29e79110 not found: ID does not exist" 
containerID="6b15094836e7b3d2648265271b5a760db1f6a40f1ce60aa76217779b29e79110" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.886625 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b15094836e7b3d2648265271b5a760db1f6a40f1ce60aa76217779b29e79110"} err="failed to get container status \"6b15094836e7b3d2648265271b5a760db1f6a40f1ce60aa76217779b29e79110\": rpc error: code = NotFound desc = could not find container \"6b15094836e7b3d2648265271b5a760db1f6a40f1ce60aa76217779b29e79110\": container with ID starting with 6b15094836e7b3d2648265271b5a760db1f6a40f1ce60aa76217779b29e79110 not found: ID does not exist" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.886651 4687 scope.go:117] "RemoveContainer" containerID="85b0f79588c173b4ba32f3ac552e441dde78210a5566f4fdcc4479c5aa22ed0d" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.886809 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.887665 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85b0f79588c173b4ba32f3ac552e441dde78210a5566f4fdcc4479c5aa22ed0d"} err="failed to get container status \"85b0f79588c173b4ba32f3ac552e441dde78210a5566f4fdcc4479c5aa22ed0d\": rpc error: code = NotFound desc = could not find container \"85b0f79588c173b4ba32f3ac552e441dde78210a5566f4fdcc4479c5aa22ed0d\": container with ID starting with 85b0f79588c173b4ba32f3ac552e441dde78210a5566f4fdcc4479c5aa22ed0d not found: ID does not exist" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.887702 4687 scope.go:117] "RemoveContainer" containerID="6b15094836e7b3d2648265271b5a760db1f6a40f1ce60aa76217779b29e79110" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.895723 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 28 09:22:03 crc 
kubenswrapper[4687]: I0228 09:22:03.897285 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b15094836e7b3d2648265271b5a760db1f6a40f1ce60aa76217779b29e79110"} err="failed to get container status \"6b15094836e7b3d2648265271b5a760db1f6a40f1ce60aa76217779b29e79110\": rpc error: code = NotFound desc = could not find container \"6b15094836e7b3d2648265271b5a760db1f6a40f1ce60aa76217779b29e79110\": container with ID starting with 6b15094836e7b3d2648265271b5a760db1f6a40f1ce60aa76217779b29e79110 not found: ID does not exist" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.918713 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.921572 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.929630 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.948426 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537842-cgjsw" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.965614 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srfdd\" (UniqueName: \"kubernetes.io/projected/542410c5-adcf-424e-966c-9c919abe28fc-kube-api-access-srfdd\") pod \"nova-scheduler-0\" (UID: \"542410c5-adcf-424e-966c-9c919abe28fc\") " pod="openstack/nova-scheduler-0" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.965714 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/542410c5-adcf-424e-966c-9c919abe28fc-config-data\") pod \"nova-scheduler-0\" (UID: \"542410c5-adcf-424e-966c-9c919abe28fc\") " pod="openstack/nova-scheduler-0" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.965793 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/542410c5-adcf-424e-966c-9c919abe28fc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"542410c5-adcf-424e-966c-9c919abe28fc\") " pod="openstack/nova-scheduler-0" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.969427 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/542410c5-adcf-424e-966c-9c919abe28fc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"542410c5-adcf-424e-966c-9c919abe28fc\") " pod="openstack/nova-scheduler-0" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.971157 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/542410c5-adcf-424e-966c-9c919abe28fc-config-data\") pod \"nova-scheduler-0\" (UID: \"542410c5-adcf-424e-966c-9c919abe28fc\") " pod="openstack/nova-scheduler-0" Feb 28 09:22:03 crc kubenswrapper[4687]: I0228 09:22:03.979247 4687 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-srfdd\" (UniqueName: \"kubernetes.io/projected/542410c5-adcf-424e-966c-9c919abe28fc-kube-api-access-srfdd\") pod \"nova-scheduler-0\" (UID: \"542410c5-adcf-424e-966c-9c919abe28fc\") " pod="openstack/nova-scheduler-0" Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.067418 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62t6n\" (UniqueName: \"kubernetes.io/projected/70abdfed-0686-450a-b900-2eda9b68cec7-kube-api-access-62t6n\") pod \"70abdfed-0686-450a-b900-2eda9b68cec7\" (UID: \"70abdfed-0686-450a-b900-2eda9b68cec7\") " Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.067850 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aac36df9-d0ba-430a-9d78-ed35d6f0723a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aac36df9-d0ba-430a-9d78-ed35d6f0723a\") " pod="openstack/nova-metadata-0" Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.068101 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aac36df9-d0ba-430a-9d78-ed35d6f0723a-logs\") pod \"nova-metadata-0\" (UID: \"aac36df9-d0ba-430a-9d78-ed35d6f0723a\") " pod="openstack/nova-metadata-0" Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.068155 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aac36df9-d0ba-430a-9d78-ed35d6f0723a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"aac36df9-d0ba-430a-9d78-ed35d6f0723a\") " pod="openstack/nova-metadata-0" Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.068212 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/aac36df9-d0ba-430a-9d78-ed35d6f0723a-config-data\") pod \"nova-metadata-0\" (UID: \"aac36df9-d0ba-430a-9d78-ed35d6f0723a\") " pod="openstack/nova-metadata-0" Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.068255 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsr4d\" (UniqueName: \"kubernetes.io/projected/aac36df9-d0ba-430a-9d78-ed35d6f0723a-kube-api-access-jsr4d\") pod \"nova-metadata-0\" (UID: \"aac36df9-d0ba-430a-9d78-ed35d6f0723a\") " pod="openstack/nova-metadata-0" Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.071924 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70abdfed-0686-450a-b900-2eda9b68cec7-kube-api-access-62t6n" (OuterVolumeSpecName: "kube-api-access-62t6n") pod "70abdfed-0686-450a-b900-2eda9b68cec7" (UID: "70abdfed-0686-450a-b900-2eda9b68cec7"). InnerVolumeSpecName "kube-api-access-62t6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.161769 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.171213 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aac36df9-d0ba-430a-9d78-ed35d6f0723a-logs\") pod \"nova-metadata-0\" (UID: \"aac36df9-d0ba-430a-9d78-ed35d6f0723a\") " pod="openstack/nova-metadata-0" Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.171283 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aac36df9-d0ba-430a-9d78-ed35d6f0723a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"aac36df9-d0ba-430a-9d78-ed35d6f0723a\") " pod="openstack/nova-metadata-0" Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.171342 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aac36df9-d0ba-430a-9d78-ed35d6f0723a-config-data\") pod \"nova-metadata-0\" (UID: \"aac36df9-d0ba-430a-9d78-ed35d6f0723a\") " pod="openstack/nova-metadata-0" Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.171387 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsr4d\" (UniqueName: \"kubernetes.io/projected/aac36df9-d0ba-430a-9d78-ed35d6f0723a-kube-api-access-jsr4d\") pod \"nova-metadata-0\" (UID: \"aac36df9-d0ba-430a-9d78-ed35d6f0723a\") " pod="openstack/nova-metadata-0" Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.171439 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aac36df9-d0ba-430a-9d78-ed35d6f0723a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aac36df9-d0ba-430a-9d78-ed35d6f0723a\") " pod="openstack/nova-metadata-0" Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.171555 4687 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-62t6n\" (UniqueName: \"kubernetes.io/projected/70abdfed-0686-450a-b900-2eda9b68cec7-kube-api-access-62t6n\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.172237 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aac36df9-d0ba-430a-9d78-ed35d6f0723a-logs\") pod \"nova-metadata-0\" (UID: \"aac36df9-d0ba-430a-9d78-ed35d6f0723a\") " pod="openstack/nova-metadata-0" Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.175795 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aac36df9-d0ba-430a-9d78-ed35d6f0723a-config-data\") pod \"nova-metadata-0\" (UID: \"aac36df9-d0ba-430a-9d78-ed35d6f0723a\") " pod="openstack/nova-metadata-0" Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.176464 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aac36df9-d0ba-430a-9d78-ed35d6f0723a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aac36df9-d0ba-430a-9d78-ed35d6f0723a\") " pod="openstack/nova-metadata-0" Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.176580 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aac36df9-d0ba-430a-9d78-ed35d6f0723a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"aac36df9-d0ba-430a-9d78-ed35d6f0723a\") " pod="openstack/nova-metadata-0" Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.188155 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsr4d\" (UniqueName: \"kubernetes.io/projected/aac36df9-d0ba-430a-9d78-ed35d6f0723a-kube-api-access-jsr4d\") pod \"nova-metadata-0\" (UID: \"aac36df9-d0ba-430a-9d78-ed35d6f0723a\") " pod="openstack/nova-metadata-0" Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 
09:22:04.238742 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.575049 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"001b7c85-0b9f-4fdb-83b7-687c36587331","Type":"ContainerStarted","Data":"3f1c18f4caff7da3666963532d3e6b56ccb0ba354e6a147d2eee113eb58a7e5a"} Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.575318 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"001b7c85-0b9f-4fdb-83b7-687c36587331","Type":"ContainerStarted","Data":"81f9989d28a11c04777a5fa1d0f91924ea262d24ce3c8f1869e19b0fd5b338a1"} Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.575331 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"001b7c85-0b9f-4fdb-83b7-687c36587331","Type":"ContainerStarted","Data":"c389697740556c3634fb7f7a53ee374b827abbe5962839c5cf201bd650f0424a"} Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.582787 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537842-cgjsw" event={"ID":"70abdfed-0686-450a-b900-2eda9b68cec7","Type":"ContainerDied","Data":"7207360c9276bcd57d8172654884778be47d9e32a5b1bafb90c3da4f725bf2ef"} Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.582831 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7207360c9276bcd57d8172654884778be47d9e32a5b1bafb90c3da4f725bf2ef" Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.582865 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537842-cgjsw" Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.599156 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.601178 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.601159092 podStartE2EDuration="2.601159092s" podCreationTimestamp="2026-02-28 09:22:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:22:04.59262829 +0000 UTC m=+1116.283197637" watchObservedRunningTime="2026-02-28 09:22:04.601159092 +0000 UTC m=+1116.291728429" Feb 28 09:22:04 crc kubenswrapper[4687]: W0228 09:22:04.665270 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaac36df9_d0ba_430a_9d78_ed35d6f0723a.slice/crio-0d2a79e172ba2e22733a0edfe9bbc4f17fbe1d30f22f5143cb2725c1ebec623d WatchSource:0}: Error finding container 0d2a79e172ba2e22733a0edfe9bbc4f17fbe1d30f22f5143cb2725c1ebec623d: Status 404 returned error can't find the container with id 0d2a79e172ba2e22733a0edfe9bbc4f17fbe1d30f22f5143cb2725c1ebec623d Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.668949 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1af91582-eba0-43db-8e20-00caea60a31a" path="/var/lib/kubelet/pods/1af91582-eba0-43db-8e20-00caea60a31a/volumes" Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.669700 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5129f869-98af-4a0f-ae3a-c3ac815078bc" path="/var/lib/kubelet/pods/5129f869-98af-4a0f-ae3a-c3ac815078bc/volumes" Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.670273 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54619cb5-80eb-4995-99b7-fdd217f640f9" 
path="/var/lib/kubelet/pods/54619cb5-80eb-4995-99b7-fdd217f640f9/volumes" Feb 28 09:22:04 crc kubenswrapper[4687]: I0228 09:22:04.671512 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 28 09:22:05 crc kubenswrapper[4687]: I0228 09:22:05.014673 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537836-jgbxm"] Feb 28 09:22:05 crc kubenswrapper[4687]: I0228 09:22:05.027752 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537836-jgbxm"] Feb 28 09:22:05 crc kubenswrapper[4687]: I0228 09:22:05.594136 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aac36df9-d0ba-430a-9d78-ed35d6f0723a","Type":"ContainerStarted","Data":"9b5dbe698d6d097dbd6ea1ae62c9c7dad0b471305ffc83095af9428e528ff4e0"} Feb 28 09:22:05 crc kubenswrapper[4687]: I0228 09:22:05.594194 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aac36df9-d0ba-430a-9d78-ed35d6f0723a","Type":"ContainerStarted","Data":"13fed5c322d0915d2ca9046663a9caf4f412ff33ed5d4b012a406a334d3b3e11"} Feb 28 09:22:05 crc kubenswrapper[4687]: I0228 09:22:05.594212 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aac36df9-d0ba-430a-9d78-ed35d6f0723a","Type":"ContainerStarted","Data":"0d2a79e172ba2e22733a0edfe9bbc4f17fbe1d30f22f5143cb2725c1ebec623d"} Feb 28 09:22:05 crc kubenswrapper[4687]: I0228 09:22:05.595826 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"542410c5-adcf-424e-966c-9c919abe28fc","Type":"ContainerStarted","Data":"ffce8eef24fa181e55bdb54f45a1c8410fcd87fc6e7a680c21c24b720bd3c883"} Feb 28 09:22:05 crc kubenswrapper[4687]: I0228 09:22:05.595884 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"542410c5-adcf-424e-966c-9c919abe28fc","Type":"ContainerStarted","Data":"7dabb298c5e55a6cb6d94df94eb43d8f3a6d40b4bb7c8b5719c82e5e0f7563c4"} Feb 28 09:22:05 crc kubenswrapper[4687]: I0228 09:22:05.623723 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.623702258 podStartE2EDuration="2.623702258s" podCreationTimestamp="2026-02-28 09:22:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:22:05.612045308 +0000 UTC m=+1117.302614645" watchObservedRunningTime="2026-02-28 09:22:05.623702258 +0000 UTC m=+1117.314271595" Feb 28 09:22:05 crc kubenswrapper[4687]: I0228 09:22:05.637882 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.637853217 podStartE2EDuration="2.637853217s" podCreationTimestamp="2026-02-28 09:22:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:22:05.632037121 +0000 UTC m=+1117.322606457" watchObservedRunningTime="2026-02-28 09:22:05.637853217 +0000 UTC m=+1117.328422554" Feb 28 09:22:06 crc kubenswrapper[4687]: I0228 09:22:06.683042 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdcf2df7-2440-48d7-ab5c-8ffacc7bdd5a" path="/var/lib/kubelet/pods/cdcf2df7-2440-48d7-ab5c-8ffacc7bdd5a/volumes" Feb 28 09:22:08 crc kubenswrapper[4687]: I0228 09:22:08.928727 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:22:08 crc kubenswrapper[4687]: I0228 09:22:08.949771 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:22:09 crc kubenswrapper[4687]: I0228 09:22:09.161849 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-scheduler-0" Feb 28 09:22:09 crc kubenswrapper[4687]: I0228 09:22:09.240258 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 28 09:22:09 crc kubenswrapper[4687]: I0228 09:22:09.240302 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 28 09:22:09 crc kubenswrapper[4687]: I0228 09:22:09.647440 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 28 09:22:10 crc kubenswrapper[4687]: I0228 09:22:10.935178 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 28 09:22:11 crc kubenswrapper[4687]: I0228 09:22:11.324203 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-vw96t"] Feb 28 09:22:11 crc kubenswrapper[4687]: E0228 09:22:11.325221 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70abdfed-0686-450a-b900-2eda9b68cec7" containerName="oc" Feb 28 09:22:11 crc kubenswrapper[4687]: I0228 09:22:11.325245 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="70abdfed-0686-450a-b900-2eda9b68cec7" containerName="oc" Feb 28 09:22:11 crc kubenswrapper[4687]: I0228 09:22:11.325471 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="70abdfed-0686-450a-b900-2eda9b68cec7" containerName="oc" Feb 28 09:22:11 crc kubenswrapper[4687]: I0228 09:22:11.326096 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vw96t" Feb 28 09:22:11 crc kubenswrapper[4687]: I0228 09:22:11.327884 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 28 09:22:11 crc kubenswrapper[4687]: I0228 09:22:11.327976 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 28 09:22:11 crc kubenswrapper[4687]: I0228 09:22:11.332857 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vw96t"] Feb 28 09:22:11 crc kubenswrapper[4687]: I0228 09:22:11.411554 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgvqd\" (UniqueName: \"kubernetes.io/projected/1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7-kube-api-access-rgvqd\") pod \"nova-cell1-cell-mapping-vw96t\" (UID: \"1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7\") " pod="openstack/nova-cell1-cell-mapping-vw96t" Feb 28 09:22:11 crc kubenswrapper[4687]: I0228 09:22:11.411758 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7-scripts\") pod \"nova-cell1-cell-mapping-vw96t\" (UID: \"1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7\") " pod="openstack/nova-cell1-cell-mapping-vw96t" Feb 28 09:22:11 crc kubenswrapper[4687]: I0228 09:22:11.411778 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vw96t\" (UID: \"1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7\") " pod="openstack/nova-cell1-cell-mapping-vw96t" Feb 28 09:22:11 crc kubenswrapper[4687]: I0228 09:22:11.411817 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7-config-data\") pod \"nova-cell1-cell-mapping-vw96t\" (UID: \"1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7\") " pod="openstack/nova-cell1-cell-mapping-vw96t" Feb 28 09:22:11 crc kubenswrapper[4687]: I0228 09:22:11.514130 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgvqd\" (UniqueName: \"kubernetes.io/projected/1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7-kube-api-access-rgvqd\") pod \"nova-cell1-cell-mapping-vw96t\" (UID: \"1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7\") " pod="openstack/nova-cell1-cell-mapping-vw96t" Feb 28 09:22:11 crc kubenswrapper[4687]: I0228 09:22:11.514239 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7-scripts\") pod \"nova-cell1-cell-mapping-vw96t\" (UID: \"1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7\") " pod="openstack/nova-cell1-cell-mapping-vw96t" Feb 28 09:22:11 crc kubenswrapper[4687]: I0228 09:22:11.514291 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vw96t\" (UID: \"1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7\") " pod="openstack/nova-cell1-cell-mapping-vw96t" Feb 28 09:22:11 crc kubenswrapper[4687]: I0228 09:22:11.515103 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7-config-data\") pod \"nova-cell1-cell-mapping-vw96t\" (UID: \"1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7\") " pod="openstack/nova-cell1-cell-mapping-vw96t" Feb 28 09:22:11 crc kubenswrapper[4687]: I0228 09:22:11.520895 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7-scripts\") pod \"nova-cell1-cell-mapping-vw96t\" (UID: \"1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7\") " pod="openstack/nova-cell1-cell-mapping-vw96t" Feb 28 09:22:11 crc kubenswrapper[4687]: I0228 09:22:11.521100 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7-config-data\") pod \"nova-cell1-cell-mapping-vw96t\" (UID: \"1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7\") " pod="openstack/nova-cell1-cell-mapping-vw96t" Feb 28 09:22:11 crc kubenswrapper[4687]: I0228 09:22:11.521887 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vw96t\" (UID: \"1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7\") " pod="openstack/nova-cell1-cell-mapping-vw96t" Feb 28 09:22:11 crc kubenswrapper[4687]: I0228 09:22:11.532535 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgvqd\" (UniqueName: \"kubernetes.io/projected/1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7-kube-api-access-rgvqd\") pod \"nova-cell1-cell-mapping-vw96t\" (UID: \"1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7\") " pod="openstack/nova-cell1-cell-mapping-vw96t" Feb 28 09:22:11 crc kubenswrapper[4687]: I0228 09:22:11.645299 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vw96t" Feb 28 09:22:12 crc kubenswrapper[4687]: I0228 09:22:12.051559 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vw96t"] Feb 28 09:22:12 crc kubenswrapper[4687]: I0228 09:22:12.607858 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 28 09:22:12 crc kubenswrapper[4687]: I0228 09:22:12.666663 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vw96t" event={"ID":"1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7","Type":"ContainerStarted","Data":"71a5cc66932b3a81fc5c97753c47d9b6ae7801ae91551b9f6c1b5f925bc09223"} Feb 28 09:22:12 crc kubenswrapper[4687]: I0228 09:22:12.666719 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vw96t" event={"ID":"1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7","Type":"ContainerStarted","Data":"3bd4d2536ac79ee3c5e0e0484ef6a0d9b3e11339701f87306176c423f9f16c77"} Feb 28 09:22:12 crc kubenswrapper[4687]: I0228 09:22:12.689997 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-vw96t" podStartSLOduration=1.6899781360000001 podStartE2EDuration="1.689978136s" podCreationTimestamp="2026-02-28 09:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:22:12.682582149 +0000 UTC m=+1124.373151487" watchObservedRunningTime="2026-02-28 09:22:12.689978136 +0000 UTC m=+1124.380547474" Feb 28 09:22:13 crc kubenswrapper[4687]: I0228 09:22:13.102993 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 28 09:22:13 crc kubenswrapper[4687]: I0228 09:22:13.103272 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 28 09:22:14 crc kubenswrapper[4687]: I0228 
09:22:14.162368 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 28 09:22:14 crc kubenswrapper[4687]: I0228 09:22:14.185183 4687 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="001b7c85-0b9f-4fdb-83b7-687c36587331" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 28 09:22:14 crc kubenswrapper[4687]: I0228 09:22:14.185222 4687 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="001b7c85-0b9f-4fdb-83b7-687c36587331" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 28 09:22:14 crc kubenswrapper[4687]: I0228 09:22:14.192987 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 28 09:22:14 crc kubenswrapper[4687]: I0228 09:22:14.240446 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 28 09:22:14 crc kubenswrapper[4687]: I0228 09:22:14.240512 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 28 09:22:14 crc kubenswrapper[4687]: I0228 09:22:14.723507 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 28 09:22:15 crc kubenswrapper[4687]: I0228 09:22:15.256224 4687 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="aac36df9-d0ba-430a-9d78-ed35d6f0723a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 28 09:22:15 crc kubenswrapper[4687]: I0228 09:22:15.256285 4687 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-metadata-0" podUID="aac36df9-d0ba-430a-9d78-ed35d6f0723a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 28 09:22:15 crc kubenswrapper[4687]: I0228 09:22:15.732982 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 28 09:22:15 crc kubenswrapper[4687]: I0228 09:22:15.733216 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="a75c27c0-aef6-4631-9a63-521ba7e5889c" containerName="kube-state-metrics" containerID="cri-o://61f947a196578b0493138114a8386b30076ee899bcabd523245d293cd935fea9" gracePeriod=30 Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.181386 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.208513 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42x2f\" (UniqueName: \"kubernetes.io/projected/a75c27c0-aef6-4631-9a63-521ba7e5889c-kube-api-access-42x2f\") pod \"a75c27c0-aef6-4631-9a63-521ba7e5889c\" (UID: \"a75c27c0-aef6-4631-9a63-521ba7e5889c\") " Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.215820 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a75c27c0-aef6-4631-9a63-521ba7e5889c-kube-api-access-42x2f" (OuterVolumeSpecName: "kube-api-access-42x2f") pod "a75c27c0-aef6-4631-9a63-521ba7e5889c" (UID: "a75c27c0-aef6-4631-9a63-521ba7e5889c"). InnerVolumeSpecName "kube-api-access-42x2f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.311502 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42x2f\" (UniqueName: \"kubernetes.io/projected/a75c27c0-aef6-4631-9a63-521ba7e5889c-kube-api-access-42x2f\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.726115 4687 generic.go:334] "Generic (PLEG): container finished" podID="1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7" containerID="71a5cc66932b3a81fc5c97753c47d9b6ae7801ae91551b9f6c1b5f925bc09223" exitCode=0 Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.726203 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vw96t" event={"ID":"1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7","Type":"ContainerDied","Data":"71a5cc66932b3a81fc5c97753c47d9b6ae7801ae91551b9f6c1b5f925bc09223"} Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.729429 4687 generic.go:334] "Generic (PLEG): container finished" podID="a75c27c0-aef6-4631-9a63-521ba7e5889c" containerID="61f947a196578b0493138114a8386b30076ee899bcabd523245d293cd935fea9" exitCode=2 Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.729496 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a75c27c0-aef6-4631-9a63-521ba7e5889c","Type":"ContainerDied","Data":"61f947a196578b0493138114a8386b30076ee899bcabd523245d293cd935fea9"} Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.729542 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a75c27c0-aef6-4631-9a63-521ba7e5889c","Type":"ContainerDied","Data":"b68ad094ec81285cd9113114d8962e1bc59b32b47951cc5e3ff499bdfc4fb5fc"} Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.729539 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.729567 4687 scope.go:117] "RemoveContainer" containerID="61f947a196578b0493138114a8386b30076ee899bcabd523245d293cd935fea9" Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.758638 4687 scope.go:117] "RemoveContainer" containerID="61f947a196578b0493138114a8386b30076ee899bcabd523245d293cd935fea9" Feb 28 09:22:16 crc kubenswrapper[4687]: E0228 09:22:16.759102 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61f947a196578b0493138114a8386b30076ee899bcabd523245d293cd935fea9\": container with ID starting with 61f947a196578b0493138114a8386b30076ee899bcabd523245d293cd935fea9 not found: ID does not exist" containerID="61f947a196578b0493138114a8386b30076ee899bcabd523245d293cd935fea9" Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.759236 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61f947a196578b0493138114a8386b30076ee899bcabd523245d293cd935fea9"} err="failed to get container status \"61f947a196578b0493138114a8386b30076ee899bcabd523245d293cd935fea9\": rpc error: code = NotFound desc = could not find container \"61f947a196578b0493138114a8386b30076ee899bcabd523245d293cd935fea9\": container with ID starting with 61f947a196578b0493138114a8386b30076ee899bcabd523245d293cd935fea9 not found: ID does not exist" Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.769498 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.777235 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.782882 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 28 09:22:16 crc kubenswrapper[4687]: E0228 
09:22:16.783413 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75c27c0-aef6-4631-9a63-521ba7e5889c" containerName="kube-state-metrics" Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.783434 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75c27c0-aef6-4631-9a63-521ba7e5889c" containerName="kube-state-metrics" Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.783658 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="a75c27c0-aef6-4631-9a63-521ba7e5889c" containerName="kube-state-metrics" Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.784378 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.787267 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.788436 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.790941 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.829275 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42ce0499-adfd-41cd-9f90-db487bc7c7a0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"42ce0499-adfd-41cd-9f90-db487bc7c7a0\") " pod="openstack/kube-state-metrics-0" Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.829390 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/42ce0499-adfd-41cd-9f90-db487bc7c7a0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: 
\"42ce0499-adfd-41cd-9f90-db487bc7c7a0\") " pod="openstack/kube-state-metrics-0" Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.829729 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/42ce0499-adfd-41cd-9f90-db487bc7c7a0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"42ce0499-adfd-41cd-9f90-db487bc7c7a0\") " pod="openstack/kube-state-metrics-0" Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.829780 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9785\" (UniqueName: \"kubernetes.io/projected/42ce0499-adfd-41cd-9f90-db487bc7c7a0-kube-api-access-b9785\") pod \"kube-state-metrics-0\" (UID: \"42ce0499-adfd-41cd-9f90-db487bc7c7a0\") " pod="openstack/kube-state-metrics-0" Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.932460 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/42ce0499-adfd-41cd-9f90-db487bc7c7a0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"42ce0499-adfd-41cd-9f90-db487bc7c7a0\") " pod="openstack/kube-state-metrics-0" Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.932551 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/42ce0499-adfd-41cd-9f90-db487bc7c7a0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"42ce0499-adfd-41cd-9f90-db487bc7c7a0\") " pod="openstack/kube-state-metrics-0" Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.932594 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9785\" (UniqueName: \"kubernetes.io/projected/42ce0499-adfd-41cd-9f90-db487bc7c7a0-kube-api-access-b9785\") pod 
\"kube-state-metrics-0\" (UID: \"42ce0499-adfd-41cd-9f90-db487bc7c7a0\") " pod="openstack/kube-state-metrics-0" Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.932664 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42ce0499-adfd-41cd-9f90-db487bc7c7a0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"42ce0499-adfd-41cd-9f90-db487bc7c7a0\") " pod="openstack/kube-state-metrics-0" Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.937733 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/42ce0499-adfd-41cd-9f90-db487bc7c7a0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"42ce0499-adfd-41cd-9f90-db487bc7c7a0\") " pod="openstack/kube-state-metrics-0" Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.938410 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/42ce0499-adfd-41cd-9f90-db487bc7c7a0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"42ce0499-adfd-41cd-9f90-db487bc7c7a0\") " pod="openstack/kube-state-metrics-0" Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.938865 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42ce0499-adfd-41cd-9f90-db487bc7c7a0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"42ce0499-adfd-41cd-9f90-db487bc7c7a0\") " pod="openstack/kube-state-metrics-0" Feb 28 09:22:16 crc kubenswrapper[4687]: I0228 09:22:16.947414 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9785\" (UniqueName: \"kubernetes.io/projected/42ce0499-adfd-41cd-9f90-db487bc7c7a0-kube-api-access-b9785\") pod \"kube-state-metrics-0\" (UID: \"42ce0499-adfd-41cd-9f90-db487bc7c7a0\") " 
pod="openstack/kube-state-metrics-0" Feb 28 09:22:17 crc kubenswrapper[4687]: I0228 09:22:17.103531 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 28 09:22:17 crc kubenswrapper[4687]: I0228 09:22:17.478723 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:22:17 crc kubenswrapper[4687]: I0228 09:22:17.479261 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="77c646e3-3eb4-488f-b3ac-34feb004a255" containerName="ceilometer-central-agent" containerID="cri-o://97cfe07edbfdd5c85516656057e74fb4dd55cc7612326f2c0fe3cea9c053f637" gracePeriod=30 Feb 28 09:22:17 crc kubenswrapper[4687]: I0228 09:22:17.479298 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="77c646e3-3eb4-488f-b3ac-34feb004a255" containerName="proxy-httpd" containerID="cri-o://2b007bb8f25bd7f308fcc2510186221cc1c834a2339de937b245f522cf7ea33e" gracePeriod=30 Feb 28 09:22:17 crc kubenswrapper[4687]: I0228 09:22:17.479339 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="77c646e3-3eb4-488f-b3ac-34feb004a255" containerName="ceilometer-notification-agent" containerID="cri-o://1b94007e9d5a85876a992b2c77fbaa06991dd9a33506f46b528ba91ae8fd3f61" gracePeriod=30 Feb 28 09:22:17 crc kubenswrapper[4687]: I0228 09:22:17.479333 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="77c646e3-3eb4-488f-b3ac-34feb004a255" containerName="sg-core" containerID="cri-o://9b68d1b548344e31f0e0b0728417dbf05a30e763edfb1cf56588609c5b8bb6d1" gracePeriod=30 Feb 28 09:22:17 crc kubenswrapper[4687]: W0228 09:22:17.519190 4687 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42ce0499_adfd_41cd_9f90_db487bc7c7a0.slice/crio-ecf59219f1191ff3715e3d6f83b5c86cd48aeb55b2633cc27a945cbb110f74c9 WatchSource:0}: Error finding container ecf59219f1191ff3715e3d6f83b5c86cd48aeb55b2633cc27a945cbb110f74c9: Status 404 returned error can't find the container with id ecf59219f1191ff3715e3d6f83b5c86cd48aeb55b2633cc27a945cbb110f74c9 Feb 28 09:22:17 crc kubenswrapper[4687]: I0228 09:22:17.519663 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 28 09:22:17 crc kubenswrapper[4687]: I0228 09:22:17.522224 4687 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 28 09:22:17 crc kubenswrapper[4687]: I0228 09:22:17.744562 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"42ce0499-adfd-41cd-9f90-db487bc7c7a0","Type":"ContainerStarted","Data":"ecf59219f1191ff3715e3d6f83b5c86cd48aeb55b2633cc27a945cbb110f74c9"} Feb 28 09:22:17 crc kubenswrapper[4687]: I0228 09:22:17.749186 4687 generic.go:334] "Generic (PLEG): container finished" podID="77c646e3-3eb4-488f-b3ac-34feb004a255" containerID="2b007bb8f25bd7f308fcc2510186221cc1c834a2339de937b245f522cf7ea33e" exitCode=0 Feb 28 09:22:17 crc kubenswrapper[4687]: I0228 09:22:17.749296 4687 generic.go:334] "Generic (PLEG): container finished" podID="77c646e3-3eb4-488f-b3ac-34feb004a255" containerID="9b68d1b548344e31f0e0b0728417dbf05a30e763edfb1cf56588609c5b8bb6d1" exitCode=2 Feb 28 09:22:17 crc kubenswrapper[4687]: I0228 09:22:17.749262 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77c646e3-3eb4-488f-b3ac-34feb004a255","Type":"ContainerDied","Data":"2b007bb8f25bd7f308fcc2510186221cc1c834a2339de937b245f522cf7ea33e"} Feb 28 09:22:17 crc kubenswrapper[4687]: I0228 09:22:17.749640 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"77c646e3-3eb4-488f-b3ac-34feb004a255","Type":"ContainerDied","Data":"9b68d1b548344e31f0e0b0728417dbf05a30e763edfb1cf56588609c5b8bb6d1"} Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.049397 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vw96t" Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.162035 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7-scripts\") pod \"1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7\" (UID: \"1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7\") " Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.162329 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7-combined-ca-bundle\") pod \"1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7\" (UID: \"1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7\") " Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.162363 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgvqd\" (UniqueName: \"kubernetes.io/projected/1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7-kube-api-access-rgvqd\") pod \"1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7\" (UID: \"1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7\") " Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.162474 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7-config-data\") pod \"1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7\" (UID: \"1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7\") " Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.167751 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7-scripts" (OuterVolumeSpecName: "scripts") pod 
"1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7" (UID: "1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.167788 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7-kube-api-access-rgvqd" (OuterVolumeSpecName: "kube-api-access-rgvqd") pod "1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7" (UID: "1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7"). InnerVolumeSpecName "kube-api-access-rgvqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.185575 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7" (UID: "1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.189004 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7-config-data" (OuterVolumeSpecName: "config-data") pod "1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7" (UID: "1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.265488 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.265528 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.265539 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.265560 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgvqd\" (UniqueName: \"kubernetes.io/projected/1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7-kube-api-access-rgvqd\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.664849 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a75c27c0-aef6-4631-9a63-521ba7e5889c" path="/var/lib/kubelet/pods/a75c27c0-aef6-4631-9a63-521ba7e5889c/volumes" Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.762774 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"42ce0499-adfd-41cd-9f90-db487bc7c7a0","Type":"ContainerStarted","Data":"636acfbda736d33d54805f4cb8920e268e1f40bd5e587937cad5a5f1966cfa8f"} Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.765656 4687 generic.go:334] "Generic (PLEG): container finished" podID="77c646e3-3eb4-488f-b3ac-34feb004a255" containerID="97cfe07edbfdd5c85516656057e74fb4dd55cc7612326f2c0fe3cea9c053f637" exitCode=0 Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.765732 4687 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77c646e3-3eb4-488f-b3ac-34feb004a255","Type":"ContainerDied","Data":"97cfe07edbfdd5c85516656057e74fb4dd55cc7612326f2c0fe3cea9c053f637"} Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.769345 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vw96t" event={"ID":"1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7","Type":"ContainerDied","Data":"3bd4d2536ac79ee3c5e0e0484ef6a0d9b3e11339701f87306176c423f9f16c77"} Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.769375 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bd4d2536ac79ee3c5e0e0484ef6a0d9b3e11339701f87306176c423f9f16c77" Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.769388 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vw96t" Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.791143 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.513904695 podStartE2EDuration="2.791124408s" podCreationTimestamp="2026-02-28 09:22:16 +0000 UTC" firstStartedPulling="2026-02-28 09:22:17.52198856 +0000 UTC m=+1129.212557897" lastFinishedPulling="2026-02-28 09:22:17.799208272 +0000 UTC m=+1129.489777610" observedRunningTime="2026-02-28 09:22:18.775893588 +0000 UTC m=+1130.466462925" watchObservedRunningTime="2026-02-28 09:22:18.791124408 +0000 UTC m=+1130.481693745" Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.934199 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.934487 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="001b7c85-0b9f-4fdb-83b7-687c36587331" containerName="nova-api-log" 
containerID="cri-o://81f9989d28a11c04777a5fa1d0f91924ea262d24ce3c8f1869e19b0fd5b338a1" gracePeriod=30 Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.934647 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="001b7c85-0b9f-4fdb-83b7-687c36587331" containerName="nova-api-api" containerID="cri-o://3f1c18f4caff7da3666963532d3e6b56ccb0ba354e6a147d2eee113eb58a7e5a" gracePeriod=30 Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.944941 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.945187 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="542410c5-adcf-424e-966c-9c919abe28fc" containerName="nova-scheduler-scheduler" containerID="cri-o://ffce8eef24fa181e55bdb54f45a1c8410fcd87fc6e7a680c21c24b720bd3c883" gracePeriod=30 Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.983400 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.983623 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="aac36df9-d0ba-430a-9d78-ed35d6f0723a" containerName="nova-metadata-log" containerID="cri-o://13fed5c322d0915d2ca9046663a9caf4f412ff33ed5d4b012a406a334d3b3e11" gracePeriod=30 Feb 28 09:22:18 crc kubenswrapper[4687]: I0228 09:22:18.983780 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="aac36df9-d0ba-430a-9d78-ed35d6f0723a" containerName="nova-metadata-metadata" containerID="cri-o://9b5dbe698d6d097dbd6ea1ae62c9c7dad0b471305ffc83095af9428e528ff4e0" gracePeriod=30 Feb 28 09:22:19 crc kubenswrapper[4687]: E0228 09:22:19.163221 4687 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot 
register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ffce8eef24fa181e55bdb54f45a1c8410fcd87fc6e7a680c21c24b720bd3c883" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 28 09:22:19 crc kubenswrapper[4687]: E0228 09:22:19.166353 4687 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ffce8eef24fa181e55bdb54f45a1c8410fcd87fc6e7a680c21c24b720bd3c883" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 28 09:22:19 crc kubenswrapper[4687]: E0228 09:22:19.167362 4687 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ffce8eef24fa181e55bdb54f45a1c8410fcd87fc6e7a680c21c24b720bd3c883" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 28 09:22:19 crc kubenswrapper[4687]: E0228 09:22:19.167421 4687 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="542410c5-adcf-424e-966c-9c919abe28fc" containerName="nova-scheduler-scheduler" Feb 28 09:22:19 crc kubenswrapper[4687]: I0228 09:22:19.779857 4687 generic.go:334] "Generic (PLEG): container finished" podID="001b7c85-0b9f-4fdb-83b7-687c36587331" containerID="81f9989d28a11c04777a5fa1d0f91924ea262d24ce3c8f1869e19b0fd5b338a1" exitCode=143 Feb 28 09:22:19 crc kubenswrapper[4687]: I0228 09:22:19.779955 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"001b7c85-0b9f-4fdb-83b7-687c36587331","Type":"ContainerDied","Data":"81f9989d28a11c04777a5fa1d0f91924ea262d24ce3c8f1869e19b0fd5b338a1"} Feb 28 09:22:19 crc kubenswrapper[4687]: I0228 09:22:19.785196 4687 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aac36df9-d0ba-430a-9d78-ed35d6f0723a","Type":"ContainerDied","Data":"13fed5c322d0915d2ca9046663a9caf4f412ff33ed5d4b012a406a334d3b3e11"} Feb 28 09:22:19 crc kubenswrapper[4687]: I0228 09:22:19.785361 4687 generic.go:334] "Generic (PLEG): container finished" podID="aac36df9-d0ba-430a-9d78-ed35d6f0723a" containerID="13fed5c322d0915d2ca9046663a9caf4f412ff33ed5d4b012a406a334d3b3e11" exitCode=143 Feb 28 09:22:19 crc kubenswrapper[4687]: I0228 09:22:19.788973 4687 generic.go:334] "Generic (PLEG): container finished" podID="77c646e3-3eb4-488f-b3ac-34feb004a255" containerID="1b94007e9d5a85876a992b2c77fbaa06991dd9a33506f46b528ba91ae8fd3f61" exitCode=0 Feb 28 09:22:19 crc kubenswrapper[4687]: I0228 09:22:19.789062 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77c646e3-3eb4-488f-b3ac-34feb004a255","Type":"ContainerDied","Data":"1b94007e9d5a85876a992b2c77fbaa06991dd9a33506f46b528ba91ae8fd3f61"} Feb 28 09:22:19 crc kubenswrapper[4687]: I0228 09:22:19.789360 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.003729 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.105287 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt9bj\" (UniqueName: \"kubernetes.io/projected/77c646e3-3eb4-488f-b3ac-34feb004a255-kube-api-access-pt9bj\") pod \"77c646e3-3eb4-488f-b3ac-34feb004a255\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.105380 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c646e3-3eb4-488f-b3ac-34feb004a255-combined-ca-bundle\") pod \"77c646e3-3eb4-488f-b3ac-34feb004a255\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.105471 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77c646e3-3eb4-488f-b3ac-34feb004a255-log-httpd\") pod \"77c646e3-3eb4-488f-b3ac-34feb004a255\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.105522 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77c646e3-3eb4-488f-b3ac-34feb004a255-scripts\") pod \"77c646e3-3eb4-488f-b3ac-34feb004a255\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.105544 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77c646e3-3eb4-488f-b3ac-34feb004a255-sg-core-conf-yaml\") pod \"77c646e3-3eb4-488f-b3ac-34feb004a255\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.105660 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/77c646e3-3eb4-488f-b3ac-34feb004a255-run-httpd\") pod \"77c646e3-3eb4-488f-b3ac-34feb004a255\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.105744 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77c646e3-3eb4-488f-b3ac-34feb004a255-config-data\") pod \"77c646e3-3eb4-488f-b3ac-34feb004a255\" (UID: \"77c646e3-3eb4-488f-b3ac-34feb004a255\") " Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.106058 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77c646e3-3eb4-488f-b3ac-34feb004a255-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "77c646e3-3eb4-488f-b3ac-34feb004a255" (UID: "77c646e3-3eb4-488f-b3ac-34feb004a255"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.106301 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77c646e3-3eb4-488f-b3ac-34feb004a255-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "77c646e3-3eb4-488f-b3ac-34feb004a255" (UID: "77c646e3-3eb4-488f-b3ac-34feb004a255"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.106698 4687 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77c646e3-3eb4-488f-b3ac-34feb004a255-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.106718 4687 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77c646e3-3eb4-488f-b3ac-34feb004a255-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.110972 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77c646e3-3eb4-488f-b3ac-34feb004a255-scripts" (OuterVolumeSpecName: "scripts") pod "77c646e3-3eb4-488f-b3ac-34feb004a255" (UID: "77c646e3-3eb4-488f-b3ac-34feb004a255"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.115181 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77c646e3-3eb4-488f-b3ac-34feb004a255-kube-api-access-pt9bj" (OuterVolumeSpecName: "kube-api-access-pt9bj") pod "77c646e3-3eb4-488f-b3ac-34feb004a255" (UID: "77c646e3-3eb4-488f-b3ac-34feb004a255"). InnerVolumeSpecName "kube-api-access-pt9bj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.133454 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77c646e3-3eb4-488f-b3ac-34feb004a255-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "77c646e3-3eb4-488f-b3ac-34feb004a255" (UID: "77c646e3-3eb4-488f-b3ac-34feb004a255"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.165153 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77c646e3-3eb4-488f-b3ac-34feb004a255-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77c646e3-3eb4-488f-b3ac-34feb004a255" (UID: "77c646e3-3eb4-488f-b3ac-34feb004a255"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.186260 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77c646e3-3eb4-488f-b3ac-34feb004a255-config-data" (OuterVolumeSpecName: "config-data") pod "77c646e3-3eb4-488f-b3ac-34feb004a255" (UID: "77c646e3-3eb4-488f-b3ac-34feb004a255"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.210383 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77c646e3-3eb4-488f-b3ac-34feb004a255-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.210434 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt9bj\" (UniqueName: \"kubernetes.io/projected/77c646e3-3eb4-488f-b3ac-34feb004a255-kube-api-access-pt9bj\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.210448 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c646e3-3eb4-488f-b3ac-34feb004a255-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.210459 4687 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77c646e3-3eb4-488f-b3ac-34feb004a255-sg-core-conf-yaml\") on node \"crc\" 
DevicePath \"\"" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.210468 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77c646e3-3eb4-488f-b3ac-34feb004a255-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.805550 4687 generic.go:334] "Generic (PLEG): container finished" podID="542410c5-adcf-424e-966c-9c919abe28fc" containerID="ffce8eef24fa181e55bdb54f45a1c8410fcd87fc6e7a680c21c24b720bd3c883" exitCode=0 Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.805627 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"542410c5-adcf-424e-966c-9c919abe28fc","Type":"ContainerDied","Data":"ffce8eef24fa181e55bdb54f45a1c8410fcd87fc6e7a680c21c24b720bd3c883"} Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.810402 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.811263 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77c646e3-3eb4-488f-b3ac-34feb004a255","Type":"ContainerDied","Data":"743be384731dcbe1912387bc5b787c933caca7cd4f1c8895eb6923869207e34e"} Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.811297 4687 scope.go:117] "RemoveContainer" containerID="2b007bb8f25bd7f308fcc2510186221cc1c834a2339de937b245f522cf7ea33e" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.861904 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.881846 4687 scope.go:117] "RemoveContainer" containerID="9b68d1b548344e31f0e0b0728417dbf05a30e763edfb1cf56588609c5b8bb6d1" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.885821 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.896121 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.901411 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:22:20 crc kubenswrapper[4687]: E0228 09:22:20.901858 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77c646e3-3eb4-488f-b3ac-34feb004a255" containerName="sg-core" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.901878 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="77c646e3-3eb4-488f-b3ac-34feb004a255" containerName="sg-core" Feb 28 09:22:20 crc kubenswrapper[4687]: E0228 09:22:20.901889 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="542410c5-adcf-424e-966c-9c919abe28fc" containerName="nova-scheduler-scheduler" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.901896 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="542410c5-adcf-424e-966c-9c919abe28fc" containerName="nova-scheduler-scheduler" Feb 28 09:22:20 crc kubenswrapper[4687]: E0228 09:22:20.901918 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77c646e3-3eb4-488f-b3ac-34feb004a255" containerName="ceilometer-central-agent" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.901925 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="77c646e3-3eb4-488f-b3ac-34feb004a255" containerName="ceilometer-central-agent" Feb 28 09:22:20 crc kubenswrapper[4687]: E0228 09:22:20.901938 4687 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="77c646e3-3eb4-488f-b3ac-34feb004a255" containerName="proxy-httpd" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.901944 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="77c646e3-3eb4-488f-b3ac-34feb004a255" containerName="proxy-httpd" Feb 28 09:22:20 crc kubenswrapper[4687]: E0228 09:22:20.901964 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77c646e3-3eb4-488f-b3ac-34feb004a255" containerName="ceilometer-notification-agent" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.901971 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="77c646e3-3eb4-488f-b3ac-34feb004a255" containerName="ceilometer-notification-agent" Feb 28 09:22:20 crc kubenswrapper[4687]: E0228 09:22:20.901982 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7" containerName="nova-manage" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.901988 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7" containerName="nova-manage" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.902203 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7" containerName="nova-manage" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.902216 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="77c646e3-3eb4-488f-b3ac-34feb004a255" containerName="sg-core" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.902224 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="542410c5-adcf-424e-966c-9c919abe28fc" containerName="nova-scheduler-scheduler" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.902235 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="77c646e3-3eb4-488f-b3ac-34feb004a255" containerName="ceilometer-notification-agent" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.902252 4687 
memory_manager.go:354] "RemoveStaleState removing state" podUID="77c646e3-3eb4-488f-b3ac-34feb004a255" containerName="proxy-httpd" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.902261 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="77c646e3-3eb4-488f-b3ac-34feb004a255" containerName="ceilometer-central-agent" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.903897 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.905363 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.906011 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.908128 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.914379 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.934704 4687 scope.go:117] "RemoveContainer" containerID="1b94007e9d5a85876a992b2c77fbaa06991dd9a33506f46b528ba91ae8fd3f61" Feb 28 09:22:20 crc kubenswrapper[4687]: I0228 09:22:20.953763 4687 scope.go:117] "RemoveContainer" containerID="97cfe07edbfdd5c85516656057e74fb4dd55cc7612326f2c0fe3cea9c053f637" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.029309 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/542410c5-adcf-424e-966c-9c919abe28fc-combined-ca-bundle\") pod \"542410c5-adcf-424e-966c-9c919abe28fc\" (UID: \"542410c5-adcf-424e-966c-9c919abe28fc\") " Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.029385 4687 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/542410c5-adcf-424e-966c-9c919abe28fc-config-data\") pod \"542410c5-adcf-424e-966c-9c919abe28fc\" (UID: \"542410c5-adcf-424e-966c-9c919abe28fc\") " Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.029633 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srfdd\" (UniqueName: \"kubernetes.io/projected/542410c5-adcf-424e-966c-9c919abe28fc-kube-api-access-srfdd\") pod \"542410c5-adcf-424e-966c-9c919abe28fc\" (UID: \"542410c5-adcf-424e-966c-9c919abe28fc\") " Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.030376 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b2tm\" (UniqueName: \"kubernetes.io/projected/e664c500-7796-4b13-b10b-f65aa311a2cd-kube-api-access-9b2tm\") pod \"ceilometer-0\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " pod="openstack/ceilometer-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.030443 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " pod="openstack/ceilometer-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.030797 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-scripts\") pod \"ceilometer-0\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " pod="openstack/ceilometer-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.031378 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e664c500-7796-4b13-b10b-f65aa311a2cd-log-httpd\") pod \"ceilometer-0\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " pod="openstack/ceilometer-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.031447 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " pod="openstack/ceilometer-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.031553 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e664c500-7796-4b13-b10b-f65aa311a2cd-run-httpd\") pod \"ceilometer-0\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " pod="openstack/ceilometer-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.031598 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " pod="openstack/ceilometer-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.031753 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-config-data\") pod \"ceilometer-0\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " pod="openstack/ceilometer-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.036347 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/542410c5-adcf-424e-966c-9c919abe28fc-kube-api-access-srfdd" (OuterVolumeSpecName: "kube-api-access-srfdd") pod "542410c5-adcf-424e-966c-9c919abe28fc" (UID: 
"542410c5-adcf-424e-966c-9c919abe28fc"). InnerVolumeSpecName "kube-api-access-srfdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.055316 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/542410c5-adcf-424e-966c-9c919abe28fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "542410c5-adcf-424e-966c-9c919abe28fc" (UID: "542410c5-adcf-424e-966c-9c919abe28fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.055709 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/542410c5-adcf-424e-966c-9c919abe28fc-config-data" (OuterVolumeSpecName: "config-data") pod "542410c5-adcf-424e-966c-9c919abe28fc" (UID: "542410c5-adcf-424e-966c-9c919abe28fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.135349 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b2tm\" (UniqueName: \"kubernetes.io/projected/e664c500-7796-4b13-b10b-f65aa311a2cd-kube-api-access-9b2tm\") pod \"ceilometer-0\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " pod="openstack/ceilometer-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.135416 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " pod="openstack/ceilometer-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.135495 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-scripts\") pod 
\"ceilometer-0\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " pod="openstack/ceilometer-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.135549 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e664c500-7796-4b13-b10b-f65aa311a2cd-log-httpd\") pod \"ceilometer-0\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " pod="openstack/ceilometer-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.135571 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " pod="openstack/ceilometer-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.135591 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e664c500-7796-4b13-b10b-f65aa311a2cd-run-httpd\") pod \"ceilometer-0\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " pod="openstack/ceilometer-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.135610 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " pod="openstack/ceilometer-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.135649 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-config-data\") pod \"ceilometer-0\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " pod="openstack/ceilometer-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.135746 4687 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-srfdd\" (UniqueName: \"kubernetes.io/projected/542410c5-adcf-424e-966c-9c919abe28fc-kube-api-access-srfdd\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.135763 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/542410c5-adcf-424e-966c-9c919abe28fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.135778 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/542410c5-adcf-424e-966c-9c919abe28fc-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.136083 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e664c500-7796-4b13-b10b-f65aa311a2cd-log-httpd\") pod \"ceilometer-0\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " pod="openstack/ceilometer-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.136408 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e664c500-7796-4b13-b10b-f65aa311a2cd-run-httpd\") pod \"ceilometer-0\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " pod="openstack/ceilometer-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.139581 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " pod="openstack/ceilometer-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.139590 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-combined-ca-bundle\") pod \"ceilometer-0\" 
(UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " pod="openstack/ceilometer-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.141582 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-config-data\") pod \"ceilometer-0\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " pod="openstack/ceilometer-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.142439 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-scripts\") pod \"ceilometer-0\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " pod="openstack/ceilometer-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.144595 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " pod="openstack/ceilometer-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.151389 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b2tm\" (UniqueName: \"kubernetes.io/projected/e664c500-7796-4b13-b10b-f65aa311a2cd-kube-api-access-9b2tm\") pod \"ceilometer-0\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " pod="openstack/ceilometer-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.225829 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:22:21 crc kubenswrapper[4687]: W0228 09:22:21.665557 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode664c500_7796_4b13_b10b_f65aa311a2cd.slice/crio-521d04bc89081492eb783c4a5eecb0ebbd05faf0c47ed6a43e0516b78d2b1e69 WatchSource:0}: Error finding container 521d04bc89081492eb783c4a5eecb0ebbd05faf0c47ed6a43e0516b78d2b1e69: Status 404 returned error can't find the container with id 521d04bc89081492eb783c4a5eecb0ebbd05faf0c47ed6a43e0516b78d2b1e69 Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.668145 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.823656 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"542410c5-adcf-424e-966c-9c919abe28fc","Type":"ContainerDied","Data":"7dabb298c5e55a6cb6d94df94eb43d8f3a6d40b4bb7c8b5719c82e5e0f7563c4"} Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.823702 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.823746 4687 scope.go:117] "RemoveContainer" containerID="ffce8eef24fa181e55bdb54f45a1c8410fcd87fc6e7a680c21c24b720bd3c883" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.828700 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e664c500-7796-4b13-b10b-f65aa311a2cd","Type":"ContainerStarted","Data":"521d04bc89081492eb783c4a5eecb0ebbd05faf0c47ed6a43e0516b78d2b1e69"} Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.861052 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.868781 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.877337 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.878624 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.880001 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.895619 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.965976 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5586e3ed-9ec4-4c0f-9d31-57120488f2cd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5586e3ed-9ec4-4c0f-9d31-57120488f2cd\") " pod="openstack/nova-scheduler-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.966248 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9jp8\" (UniqueName: \"kubernetes.io/projected/5586e3ed-9ec4-4c0f-9d31-57120488f2cd-kube-api-access-k9jp8\") pod \"nova-scheduler-0\" (UID: \"5586e3ed-9ec4-4c0f-9d31-57120488f2cd\") " pod="openstack/nova-scheduler-0" Feb 28 09:22:21 crc kubenswrapper[4687]: I0228 09:22:21.966712 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5586e3ed-9ec4-4c0f-9d31-57120488f2cd-config-data\") pod \"nova-scheduler-0\" (UID: \"5586e3ed-9ec4-4c0f-9d31-57120488f2cd\") " pod="openstack/nova-scheduler-0" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.069181 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5586e3ed-9ec4-4c0f-9d31-57120488f2cd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5586e3ed-9ec4-4c0f-9d31-57120488f2cd\") " pod="openstack/nova-scheduler-0" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.069381 4687 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9jp8\" (UniqueName: \"kubernetes.io/projected/5586e3ed-9ec4-4c0f-9d31-57120488f2cd-kube-api-access-k9jp8\") pod \"nova-scheduler-0\" (UID: \"5586e3ed-9ec4-4c0f-9d31-57120488f2cd\") " pod="openstack/nova-scheduler-0" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.069599 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5586e3ed-9ec4-4c0f-9d31-57120488f2cd-config-data\") pod \"nova-scheduler-0\" (UID: \"5586e3ed-9ec4-4c0f-9d31-57120488f2cd\") " pod="openstack/nova-scheduler-0" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.077606 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5586e3ed-9ec4-4c0f-9d31-57120488f2cd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5586e3ed-9ec4-4c0f-9d31-57120488f2cd\") " pod="openstack/nova-scheduler-0" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.079548 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5586e3ed-9ec4-4c0f-9d31-57120488f2cd-config-data\") pod \"nova-scheduler-0\" (UID: \"5586e3ed-9ec4-4c0f-9d31-57120488f2cd\") " pod="openstack/nova-scheduler-0" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.085557 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9jp8\" (UniqueName: \"kubernetes.io/projected/5586e3ed-9ec4-4c0f-9d31-57120488f2cd-kube-api-access-k9jp8\") pod \"nova-scheduler-0\" (UID: \"5586e3ed-9ec4-4c0f-9d31-57120488f2cd\") " pod="openstack/nova-scheduler-0" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.192942 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.456549 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.528786 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.579352 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aac36df9-d0ba-430a-9d78-ed35d6f0723a-combined-ca-bundle\") pod \"aac36df9-d0ba-430a-9d78-ed35d6f0723a\" (UID: \"aac36df9-d0ba-430a-9d78-ed35d6f0723a\") " Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.579626 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aac36df9-d0ba-430a-9d78-ed35d6f0723a-logs\") pod \"aac36df9-d0ba-430a-9d78-ed35d6f0723a\" (UID: \"aac36df9-d0ba-430a-9d78-ed35d6f0723a\") " Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.580100 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aac36df9-d0ba-430a-9d78-ed35d6f0723a-nova-metadata-tls-certs\") pod \"aac36df9-d0ba-430a-9d78-ed35d6f0723a\" (UID: \"aac36df9-d0ba-430a-9d78-ed35d6f0723a\") " Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.580135 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aac36df9-d0ba-430a-9d78-ed35d6f0723a-config-data\") pod \"aac36df9-d0ba-430a-9d78-ed35d6f0723a\" (UID: \"aac36df9-d0ba-430a-9d78-ed35d6f0723a\") " Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.580615 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsr4d\" (UniqueName: 
\"kubernetes.io/projected/aac36df9-d0ba-430a-9d78-ed35d6f0723a-kube-api-access-jsr4d\") pod \"aac36df9-d0ba-430a-9d78-ed35d6f0723a\" (UID: \"aac36df9-d0ba-430a-9d78-ed35d6f0723a\") " Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.580959 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54hqp\" (UniqueName: \"kubernetes.io/projected/001b7c85-0b9f-4fdb-83b7-687c36587331-kube-api-access-54hqp\") pod \"001b7c85-0b9f-4fdb-83b7-687c36587331\" (UID: \"001b7c85-0b9f-4fdb-83b7-687c36587331\") " Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.581320 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/001b7c85-0b9f-4fdb-83b7-687c36587331-combined-ca-bundle\") pod \"001b7c85-0b9f-4fdb-83b7-687c36587331\" (UID: \"001b7c85-0b9f-4fdb-83b7-687c36587331\") " Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.582203 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/001b7c85-0b9f-4fdb-83b7-687c36587331-logs\") pod \"001b7c85-0b9f-4fdb-83b7-687c36587331\" (UID: \"001b7c85-0b9f-4fdb-83b7-687c36587331\") " Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.582874 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aac36df9-d0ba-430a-9d78-ed35d6f0723a-logs" (OuterVolumeSpecName: "logs") pod "aac36df9-d0ba-430a-9d78-ed35d6f0723a" (UID: "aac36df9-d0ba-430a-9d78-ed35d6f0723a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.583953 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/001b7c85-0b9f-4fdb-83b7-687c36587331-logs" (OuterVolumeSpecName: "logs") pod "001b7c85-0b9f-4fdb-83b7-687c36587331" (UID: "001b7c85-0b9f-4fdb-83b7-687c36587331"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.584145 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/001b7c85-0b9f-4fdb-83b7-687c36587331-config-data\") pod \"001b7c85-0b9f-4fdb-83b7-687c36587331\" (UID: \"001b7c85-0b9f-4fdb-83b7-687c36587331\") " Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.585178 4687 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aac36df9-d0ba-430a-9d78-ed35d6f0723a-logs\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.585254 4687 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/001b7c85-0b9f-4fdb-83b7-687c36587331-logs\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.586613 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aac36df9-d0ba-430a-9d78-ed35d6f0723a-kube-api-access-jsr4d" (OuterVolumeSpecName: "kube-api-access-jsr4d") pod "aac36df9-d0ba-430a-9d78-ed35d6f0723a" (UID: "aac36df9-d0ba-430a-9d78-ed35d6f0723a"). InnerVolumeSpecName "kube-api-access-jsr4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.586947 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/001b7c85-0b9f-4fdb-83b7-687c36587331-kube-api-access-54hqp" (OuterVolumeSpecName: "kube-api-access-54hqp") pod "001b7c85-0b9f-4fdb-83b7-687c36587331" (UID: "001b7c85-0b9f-4fdb-83b7-687c36587331"). InnerVolumeSpecName "kube-api-access-54hqp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.604666 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/001b7c85-0b9f-4fdb-83b7-687c36587331-config-data" (OuterVolumeSpecName: "config-data") pod "001b7c85-0b9f-4fdb-83b7-687c36587331" (UID: "001b7c85-0b9f-4fdb-83b7-687c36587331"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.605998 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aac36df9-d0ba-430a-9d78-ed35d6f0723a-config-data" (OuterVolumeSpecName: "config-data") pod "aac36df9-d0ba-430a-9d78-ed35d6f0723a" (UID: "aac36df9-d0ba-430a-9d78-ed35d6f0723a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.606335 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aac36df9-d0ba-430a-9d78-ed35d6f0723a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aac36df9-d0ba-430a-9d78-ed35d6f0723a" (UID: "aac36df9-d0ba-430a-9d78-ed35d6f0723a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.606596 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/001b7c85-0b9f-4fdb-83b7-687c36587331-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "001b7c85-0b9f-4fdb-83b7-687c36587331" (UID: "001b7c85-0b9f-4fdb-83b7-687c36587331"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.631987 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aac36df9-d0ba-430a-9d78-ed35d6f0723a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "aac36df9-d0ba-430a-9d78-ed35d6f0723a" (UID: "aac36df9-d0ba-430a-9d78-ed35d6f0723a"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.665696 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="542410c5-adcf-424e-966c-9c919abe28fc" path="/var/lib/kubelet/pods/542410c5-adcf-424e-966c-9c919abe28fc/volumes" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.666392 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77c646e3-3eb4-488f-b3ac-34feb004a255" path="/var/lib/kubelet/pods/77c646e3-3eb4-488f-b3ac-34feb004a255/volumes" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.687447 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54hqp\" (UniqueName: \"kubernetes.io/projected/001b7c85-0b9f-4fdb-83b7-687c36587331-kube-api-access-54hqp\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.687470 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/001b7c85-0b9f-4fdb-83b7-687c36587331-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.690601 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/001b7c85-0b9f-4fdb-83b7-687c36587331-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.690627 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aac36df9-d0ba-430a-9d78-ed35d6f0723a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.690639 4687 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aac36df9-d0ba-430a-9d78-ed35d6f0723a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.690652 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aac36df9-d0ba-430a-9d78-ed35d6f0723a-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.690662 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsr4d\" (UniqueName: \"kubernetes.io/projected/aac36df9-d0ba-430a-9d78-ed35d6f0723a-kube-api-access-jsr4d\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.734812 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 28 09:22:22 crc kubenswrapper[4687]: W0228 09:22:22.735164 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5586e3ed_9ec4_4c0f_9d31_57120488f2cd.slice/crio-c234fe2cf3def2861a7ba5fc7b60ee6687649882c830d3453df5e44ca85e489f WatchSource:0}: Error finding container c234fe2cf3def2861a7ba5fc7b60ee6687649882c830d3453df5e44ca85e489f: Status 404 returned error can't find the container with id c234fe2cf3def2861a7ba5fc7b60ee6687649882c830d3453df5e44ca85e489f Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.847382 4687 generic.go:334] "Generic (PLEG): container finished" podID="aac36df9-d0ba-430a-9d78-ed35d6f0723a" containerID="9b5dbe698d6d097dbd6ea1ae62c9c7dad0b471305ffc83095af9428e528ff4e0" exitCode=0 Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.847458 4687 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.847499 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aac36df9-d0ba-430a-9d78-ed35d6f0723a","Type":"ContainerDied","Data":"9b5dbe698d6d097dbd6ea1ae62c9c7dad0b471305ffc83095af9428e528ff4e0"} Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.848669 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aac36df9-d0ba-430a-9d78-ed35d6f0723a","Type":"ContainerDied","Data":"0d2a79e172ba2e22733a0edfe9bbc4f17fbe1d30f22f5143cb2725c1ebec623d"} Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.848703 4687 scope.go:117] "RemoveContainer" containerID="9b5dbe698d6d097dbd6ea1ae62c9c7dad0b471305ffc83095af9428e528ff4e0" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.852531 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e664c500-7796-4b13-b10b-f65aa311a2cd","Type":"ContainerStarted","Data":"90da960ed34c192f62c5d2f924c1f01b0b901ddb7b46e74dbaeabcc9c2642e18"} Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.861341 4687 generic.go:334] "Generic (PLEG): container finished" podID="001b7c85-0b9f-4fdb-83b7-687c36587331" containerID="3f1c18f4caff7da3666963532d3e6b56ccb0ba354e6a147d2eee113eb58a7e5a" exitCode=0 Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.861478 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.861511 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"001b7c85-0b9f-4fdb-83b7-687c36587331","Type":"ContainerDied","Data":"3f1c18f4caff7da3666963532d3e6b56ccb0ba354e6a147d2eee113eb58a7e5a"} Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.861532 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"001b7c85-0b9f-4fdb-83b7-687c36587331","Type":"ContainerDied","Data":"c389697740556c3634fb7f7a53ee374b827abbe5962839c5cf201bd650f0424a"} Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.866576 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5586e3ed-9ec4-4c0f-9d31-57120488f2cd","Type":"ContainerStarted","Data":"c234fe2cf3def2861a7ba5fc7b60ee6687649882c830d3453df5e44ca85e489f"} Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.881180 4687 scope.go:117] "RemoveContainer" containerID="13fed5c322d0915d2ca9046663a9caf4f412ff33ed5d4b012a406a334d3b3e11" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.904524 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.918227 4687 scope.go:117] "RemoveContainer" containerID="9b5dbe698d6d097dbd6ea1ae62c9c7dad0b471305ffc83095af9428e528ff4e0" Feb 28 09:22:22 crc kubenswrapper[4687]: E0228 09:22:22.918681 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b5dbe698d6d097dbd6ea1ae62c9c7dad0b471305ffc83095af9428e528ff4e0\": container with ID starting with 9b5dbe698d6d097dbd6ea1ae62c9c7dad0b471305ffc83095af9428e528ff4e0 not found: ID does not exist" containerID="9b5dbe698d6d097dbd6ea1ae62c9c7dad0b471305ffc83095af9428e528ff4e0" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.918717 4687 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b5dbe698d6d097dbd6ea1ae62c9c7dad0b471305ffc83095af9428e528ff4e0"} err="failed to get container status \"9b5dbe698d6d097dbd6ea1ae62c9c7dad0b471305ffc83095af9428e528ff4e0\": rpc error: code = NotFound desc = could not find container \"9b5dbe698d6d097dbd6ea1ae62c9c7dad0b471305ffc83095af9428e528ff4e0\": container with ID starting with 9b5dbe698d6d097dbd6ea1ae62c9c7dad0b471305ffc83095af9428e528ff4e0 not found: ID does not exist" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.918744 4687 scope.go:117] "RemoveContainer" containerID="13fed5c322d0915d2ca9046663a9caf4f412ff33ed5d4b012a406a334d3b3e11" Feb 28 09:22:22 crc kubenswrapper[4687]: E0228 09:22:22.919192 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13fed5c322d0915d2ca9046663a9caf4f412ff33ed5d4b012a406a334d3b3e11\": container with ID starting with 13fed5c322d0915d2ca9046663a9caf4f412ff33ed5d4b012a406a334d3b3e11 not found: ID does not exist" containerID="13fed5c322d0915d2ca9046663a9caf4f412ff33ed5d4b012a406a334d3b3e11" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.919214 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13fed5c322d0915d2ca9046663a9caf4f412ff33ed5d4b012a406a334d3b3e11"} err="failed to get container status \"13fed5c322d0915d2ca9046663a9caf4f412ff33ed5d4b012a406a334d3b3e11\": rpc error: code = NotFound desc = could not find container \"13fed5c322d0915d2ca9046663a9caf4f412ff33ed5d4b012a406a334d3b3e11\": container with ID starting with 13fed5c322d0915d2ca9046663a9caf4f412ff33ed5d4b012a406a334d3b3e11 not found: ID does not exist" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.919230 4687 scope.go:117] "RemoveContainer" containerID="3f1c18f4caff7da3666963532d3e6b56ccb0ba354e6a147d2eee113eb58a7e5a" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 
09:22:22.928094 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.943848 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.953599 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.962328 4687 scope.go:117] "RemoveContainer" containerID="81f9989d28a11c04777a5fa1d0f91924ea262d24ce3c8f1869e19b0fd5b338a1" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.962439 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 28 09:22:22 crc kubenswrapper[4687]: E0228 09:22:22.963354 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aac36df9-d0ba-430a-9d78-ed35d6f0723a" containerName="nova-metadata-log" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.963373 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="aac36df9-d0ba-430a-9d78-ed35d6f0723a" containerName="nova-metadata-log" Feb 28 09:22:22 crc kubenswrapper[4687]: E0228 09:22:22.963388 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aac36df9-d0ba-430a-9d78-ed35d6f0723a" containerName="nova-metadata-metadata" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.963394 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="aac36df9-d0ba-430a-9d78-ed35d6f0723a" containerName="nova-metadata-metadata" Feb 28 09:22:22 crc kubenswrapper[4687]: E0228 09:22:22.963422 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="001b7c85-0b9f-4fdb-83b7-687c36587331" containerName="nova-api-log" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.963428 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="001b7c85-0b9f-4fdb-83b7-687c36587331" containerName="nova-api-log" Feb 28 09:22:22 crc kubenswrapper[4687]: E0228 09:22:22.963450 4687 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="001b7c85-0b9f-4fdb-83b7-687c36587331" containerName="nova-api-api" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.963457 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="001b7c85-0b9f-4fdb-83b7-687c36587331" containerName="nova-api-api" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.963660 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="aac36df9-d0ba-430a-9d78-ed35d6f0723a" containerName="nova-metadata-log" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.964123 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="aac36df9-d0ba-430a-9d78-ed35d6f0723a" containerName="nova-metadata-metadata" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.964160 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="001b7c85-0b9f-4fdb-83b7-687c36587331" containerName="nova-api-log" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.964170 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="001b7c85-0b9f-4fdb-83b7-687c36587331" containerName="nova-api-api" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.965425 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.967471 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.969517 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.969773 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.976531 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.978084 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.980083 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.992763 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.993489 4687 scope.go:117] "RemoveContainer" containerID="3f1c18f4caff7da3666963532d3e6b56ccb0ba354e6a147d2eee113eb58a7e5a" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.995068 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4ccf04-08de-4138-ba4a-b8f5659a37fc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8d4ccf04-08de-4138-ba4a-b8f5659a37fc\") " pod="openstack/nova-metadata-0" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.995144 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8d4ccf04-08de-4138-ba4a-b8f5659a37fc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8d4ccf04-08de-4138-ba4a-b8f5659a37fc\") " pod="openstack/nova-metadata-0" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.995197 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d4ccf04-08de-4138-ba4a-b8f5659a37fc-logs\") pod \"nova-metadata-0\" (UID: \"8d4ccf04-08de-4138-ba4a-b8f5659a37fc\") " pod="openstack/nova-metadata-0" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.995251 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b65eafc1-4ab2-47ec-ac65-5b9b8174833a-config-data\") pod \"nova-api-0\" (UID: \"b65eafc1-4ab2-47ec-ac65-5b9b8174833a\") " pod="openstack/nova-api-0" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.995367 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5hkn\" (UniqueName: \"kubernetes.io/projected/b65eafc1-4ab2-47ec-ac65-5b9b8174833a-kube-api-access-r5hkn\") pod \"nova-api-0\" (UID: \"b65eafc1-4ab2-47ec-ac65-5b9b8174833a\") " pod="openstack/nova-api-0" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.995401 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b65eafc1-4ab2-47ec-ac65-5b9b8174833a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b65eafc1-4ab2-47ec-ac65-5b9b8174833a\") " pod="openstack/nova-api-0" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.995485 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d4ccf04-08de-4138-ba4a-b8f5659a37fc-config-data\") pod \"nova-metadata-0\" (UID: 
\"8d4ccf04-08de-4138-ba4a-b8f5659a37fc\") " pod="openstack/nova-metadata-0" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.995506 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks66l\" (UniqueName: \"kubernetes.io/projected/8d4ccf04-08de-4138-ba4a-b8f5659a37fc-kube-api-access-ks66l\") pod \"nova-metadata-0\" (UID: \"8d4ccf04-08de-4138-ba4a-b8f5659a37fc\") " pod="openstack/nova-metadata-0" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.995543 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b65eafc1-4ab2-47ec-ac65-5b9b8174833a-logs\") pod \"nova-api-0\" (UID: \"b65eafc1-4ab2-47ec-ac65-5b9b8174833a\") " pod="openstack/nova-api-0" Feb 28 09:22:22 crc kubenswrapper[4687]: E0228 09:22:22.998327 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f1c18f4caff7da3666963532d3e6b56ccb0ba354e6a147d2eee113eb58a7e5a\": container with ID starting with 3f1c18f4caff7da3666963532d3e6b56ccb0ba354e6a147d2eee113eb58a7e5a not found: ID does not exist" containerID="3f1c18f4caff7da3666963532d3e6b56ccb0ba354e6a147d2eee113eb58a7e5a" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.998382 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f1c18f4caff7da3666963532d3e6b56ccb0ba354e6a147d2eee113eb58a7e5a"} err="failed to get container status \"3f1c18f4caff7da3666963532d3e6b56ccb0ba354e6a147d2eee113eb58a7e5a\": rpc error: code = NotFound desc = could not find container \"3f1c18f4caff7da3666963532d3e6b56ccb0ba354e6a147d2eee113eb58a7e5a\": container with ID starting with 3f1c18f4caff7da3666963532d3e6b56ccb0ba354e6a147d2eee113eb58a7e5a not found: ID does not exist" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.998417 4687 scope.go:117] "RemoveContainer" 
containerID="81f9989d28a11c04777a5fa1d0f91924ea262d24ce3c8f1869e19b0fd5b338a1" Feb 28 09:22:22 crc kubenswrapper[4687]: E0228 09:22:22.998847 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81f9989d28a11c04777a5fa1d0f91924ea262d24ce3c8f1869e19b0fd5b338a1\": container with ID starting with 81f9989d28a11c04777a5fa1d0f91924ea262d24ce3c8f1869e19b0fd5b338a1 not found: ID does not exist" containerID="81f9989d28a11c04777a5fa1d0f91924ea262d24ce3c8f1869e19b0fd5b338a1" Feb 28 09:22:22 crc kubenswrapper[4687]: I0228 09:22:22.998874 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81f9989d28a11c04777a5fa1d0f91924ea262d24ce3c8f1869e19b0fd5b338a1"} err="failed to get container status \"81f9989d28a11c04777a5fa1d0f91924ea262d24ce3c8f1869e19b0fd5b338a1\": rpc error: code = NotFound desc = could not find container \"81f9989d28a11c04777a5fa1d0f91924ea262d24ce3c8f1869e19b0fd5b338a1\": container with ID starting with 81f9989d28a11c04777a5fa1d0f91924ea262d24ce3c8f1869e19b0fd5b338a1 not found: ID does not exist" Feb 28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.097153 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d4ccf04-08de-4138-ba4a-b8f5659a37fc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8d4ccf04-08de-4138-ba4a-b8f5659a37fc\") " pod="openstack/nova-metadata-0" Feb 28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.097340 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d4ccf04-08de-4138-ba4a-b8f5659a37fc-logs\") pod \"nova-metadata-0\" (UID: \"8d4ccf04-08de-4138-ba4a-b8f5659a37fc\") " pod="openstack/nova-metadata-0" Feb 28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.097433 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b65eafc1-4ab2-47ec-ac65-5b9b8174833a-config-data\") pod \"nova-api-0\" (UID: \"b65eafc1-4ab2-47ec-ac65-5b9b8174833a\") " pod="openstack/nova-api-0" Feb 28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.097557 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5hkn\" (UniqueName: \"kubernetes.io/projected/b65eafc1-4ab2-47ec-ac65-5b9b8174833a-kube-api-access-r5hkn\") pod \"nova-api-0\" (UID: \"b65eafc1-4ab2-47ec-ac65-5b9b8174833a\") " pod="openstack/nova-api-0" Feb 28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.097632 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b65eafc1-4ab2-47ec-ac65-5b9b8174833a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b65eafc1-4ab2-47ec-ac65-5b9b8174833a\") " pod="openstack/nova-api-0" Feb 28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.097737 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d4ccf04-08de-4138-ba4a-b8f5659a37fc-config-data\") pod \"nova-metadata-0\" (UID: \"8d4ccf04-08de-4138-ba4a-b8f5659a37fc\") " pod="openstack/nova-metadata-0" Feb 28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.097811 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks66l\" (UniqueName: \"kubernetes.io/projected/8d4ccf04-08de-4138-ba4a-b8f5659a37fc-kube-api-access-ks66l\") pod \"nova-metadata-0\" (UID: \"8d4ccf04-08de-4138-ba4a-b8f5659a37fc\") " pod="openstack/nova-metadata-0" Feb 28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.097886 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b65eafc1-4ab2-47ec-ac65-5b9b8174833a-logs\") pod \"nova-api-0\" (UID: \"b65eafc1-4ab2-47ec-ac65-5b9b8174833a\") " pod="openstack/nova-api-0" Feb 
28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.097969 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4ccf04-08de-4138-ba4a-b8f5659a37fc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8d4ccf04-08de-4138-ba4a-b8f5659a37fc\") " pod="openstack/nova-metadata-0" Feb 28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.099403 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d4ccf04-08de-4138-ba4a-b8f5659a37fc-logs\") pod \"nova-metadata-0\" (UID: \"8d4ccf04-08de-4138-ba4a-b8f5659a37fc\") " pod="openstack/nova-metadata-0" Feb 28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.101111 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b65eafc1-4ab2-47ec-ac65-5b9b8174833a-logs\") pod \"nova-api-0\" (UID: \"b65eafc1-4ab2-47ec-ac65-5b9b8174833a\") " pod="openstack/nova-api-0" Feb 28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.101421 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4ccf04-08de-4138-ba4a-b8f5659a37fc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8d4ccf04-08de-4138-ba4a-b8f5659a37fc\") " pod="openstack/nova-metadata-0" Feb 28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.102336 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d4ccf04-08de-4138-ba4a-b8f5659a37fc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8d4ccf04-08de-4138-ba4a-b8f5659a37fc\") " pod="openstack/nova-metadata-0" Feb 28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.105138 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d4ccf04-08de-4138-ba4a-b8f5659a37fc-config-data\") pod 
\"nova-metadata-0\" (UID: \"8d4ccf04-08de-4138-ba4a-b8f5659a37fc\") " pod="openstack/nova-metadata-0" Feb 28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.105357 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b65eafc1-4ab2-47ec-ac65-5b9b8174833a-config-data\") pod \"nova-api-0\" (UID: \"b65eafc1-4ab2-47ec-ac65-5b9b8174833a\") " pod="openstack/nova-api-0" Feb 28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.108001 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b65eafc1-4ab2-47ec-ac65-5b9b8174833a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b65eafc1-4ab2-47ec-ac65-5b9b8174833a\") " pod="openstack/nova-api-0" Feb 28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.116444 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5hkn\" (UniqueName: \"kubernetes.io/projected/b65eafc1-4ab2-47ec-ac65-5b9b8174833a-kube-api-access-r5hkn\") pod \"nova-api-0\" (UID: \"b65eafc1-4ab2-47ec-ac65-5b9b8174833a\") " pod="openstack/nova-api-0" Feb 28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.116775 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks66l\" (UniqueName: \"kubernetes.io/projected/8d4ccf04-08de-4138-ba4a-b8f5659a37fc-kube-api-access-ks66l\") pod \"nova-metadata-0\" (UID: \"8d4ccf04-08de-4138-ba4a-b8f5659a37fc\") " pod="openstack/nova-metadata-0" Feb 28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.292865 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.299908 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.725480 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.764431 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 28 09:22:23 crc kubenswrapper[4687]: W0228 09:22:23.772457 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d4ccf04_08de_4138_ba4a_b8f5659a37fc.slice/crio-1c3547589709f5b3ddc4286859bc640b9344ea0591939a7e4d58c10a2ee7c42c WatchSource:0}: Error finding container 1c3547589709f5b3ddc4286859bc640b9344ea0591939a7e4d58c10a2ee7c42c: Status 404 returned error can't find the container with id 1c3547589709f5b3ddc4286859bc640b9344ea0591939a7e4d58c10a2ee7c42c Feb 28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.878890 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d4ccf04-08de-4138-ba4a-b8f5659a37fc","Type":"ContainerStarted","Data":"1c3547589709f5b3ddc4286859bc640b9344ea0591939a7e4d58c10a2ee7c42c"} Feb 28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.882278 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e664c500-7796-4b13-b10b-f65aa311a2cd","Type":"ContainerStarted","Data":"77ed7c1c1f17142acf765d748913a61c95f252bb07234c31049e2fcda7b9b5b0"} Feb 28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.885484 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b65eafc1-4ab2-47ec-ac65-5b9b8174833a","Type":"ContainerStarted","Data":"37130d5102268bd7a8aeb68eae7bef3349684ee5951c5a8d8453027d65fd4187"} Feb 28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.887425 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"5586e3ed-9ec4-4c0f-9d31-57120488f2cd","Type":"ContainerStarted","Data":"6861ad46ac8465e79509a131739d2c62e528661344725e4080c529b9fb2897f5"} Feb 28 09:22:23 crc kubenswrapper[4687]: I0228 09:22:23.909473 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.9094567700000002 podStartE2EDuration="2.90945677s" podCreationTimestamp="2026-02-28 09:22:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:22:23.901236503 +0000 UTC m=+1135.591805840" watchObservedRunningTime="2026-02-28 09:22:23.90945677 +0000 UTC m=+1135.600026107" Feb 28 09:22:24 crc kubenswrapper[4687]: I0228 09:22:24.667465 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="001b7c85-0b9f-4fdb-83b7-687c36587331" path="/var/lib/kubelet/pods/001b7c85-0b9f-4fdb-83b7-687c36587331/volumes" Feb 28 09:22:24 crc kubenswrapper[4687]: I0228 09:22:24.668749 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aac36df9-d0ba-430a-9d78-ed35d6f0723a" path="/var/lib/kubelet/pods/aac36df9-d0ba-430a-9d78-ed35d6f0723a/volumes" Feb 28 09:22:24 crc kubenswrapper[4687]: I0228 09:22:24.898276 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e664c500-7796-4b13-b10b-f65aa311a2cd","Type":"ContainerStarted","Data":"f92473cc8680583a887cbeacf84beb53b054d22055b5301bf91da953f58ed9dd"} Feb 28 09:22:24 crc kubenswrapper[4687]: I0228 09:22:24.899899 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b65eafc1-4ab2-47ec-ac65-5b9b8174833a","Type":"ContainerStarted","Data":"3a9210793c3823a5913e3ac6a049bf279b261a8dfa4e3a1f8de61c0439906721"} Feb 28 09:22:24 crc kubenswrapper[4687]: I0228 09:22:24.900008 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"b65eafc1-4ab2-47ec-ac65-5b9b8174833a","Type":"ContainerStarted","Data":"5b77c69d2a891abbb5966b518241f3d7d712a97e81262ac57ec3599b65d085db"} Feb 28 09:22:24 crc kubenswrapper[4687]: I0228 09:22:24.903039 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d4ccf04-08de-4138-ba4a-b8f5659a37fc","Type":"ContainerStarted","Data":"0740a465f9abe34921bbd5ba4a92e320c26c13ac48062174c9d494daf52934b4"} Feb 28 09:22:24 crc kubenswrapper[4687]: I0228 09:22:24.903171 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d4ccf04-08de-4138-ba4a-b8f5659a37fc","Type":"ContainerStarted","Data":"50d7858ab41e5a514826b4694413fbc29c89f2a4504729c40c7e3c027b6173ab"} Feb 28 09:22:24 crc kubenswrapper[4687]: I0228 09:22:24.924185 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.924165335 podStartE2EDuration="2.924165335s" podCreationTimestamp="2026-02-28 09:22:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:22:24.922804064 +0000 UTC m=+1136.613373401" watchObservedRunningTime="2026-02-28 09:22:24.924165335 +0000 UTC m=+1136.614734672" Feb 28 09:22:24 crc kubenswrapper[4687]: I0228 09:22:24.948744 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.948731949 podStartE2EDuration="2.948731949s" podCreationTimestamp="2026-02-28 09:22:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:22:24.938903748 +0000 UTC m=+1136.629473085" watchObservedRunningTime="2026-02-28 09:22:24.948731949 +0000 UTC m=+1136.639301286" Feb 28 09:22:25 crc kubenswrapper[4687]: I0228 09:22:25.002396 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:22:25 crc kubenswrapper[4687]: I0228 09:22:25.002456 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:22:25 crc kubenswrapper[4687]: I0228 09:22:25.916609 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e664c500-7796-4b13-b10b-f65aa311a2cd","Type":"ContainerStarted","Data":"7f04d67585432201b32ab0df44fd253294856610d70ddb9b0929390a4537d9cd"} Feb 28 09:22:25 crc kubenswrapper[4687]: I0228 09:22:25.942712 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.199065156 podStartE2EDuration="5.942689934s" podCreationTimestamp="2026-02-28 09:22:20 +0000 UTC" firstStartedPulling="2026-02-28 09:22:21.669781383 +0000 UTC m=+1133.360350720" lastFinishedPulling="2026-02-28 09:22:25.413406151 +0000 UTC m=+1137.103975498" observedRunningTime="2026-02-28 09:22:25.939085075 +0000 UTC m=+1137.629654412" watchObservedRunningTime="2026-02-28 09:22:25.942689934 +0000 UTC m=+1137.633259271" Feb 28 09:22:26 crc kubenswrapper[4687]: I0228 09:22:26.923957 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 28 09:22:27 crc kubenswrapper[4687]: I0228 09:22:27.115150 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 28 09:22:27 crc kubenswrapper[4687]: I0228 09:22:27.193760 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-scheduler-0" Feb 28 09:22:28 crc kubenswrapper[4687]: I0228 09:22:28.293861 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 28 09:22:28 crc kubenswrapper[4687]: I0228 09:22:28.294325 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 28 09:22:32 crc kubenswrapper[4687]: I0228 09:22:32.193858 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 28 09:22:32 crc kubenswrapper[4687]: I0228 09:22:32.221323 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 28 09:22:33 crc kubenswrapper[4687]: I0228 09:22:33.014410 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 28 09:22:33 crc kubenswrapper[4687]: I0228 09:22:33.293468 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 28 09:22:33 crc kubenswrapper[4687]: I0228 09:22:33.294037 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 28 09:22:33 crc kubenswrapper[4687]: I0228 09:22:33.301055 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 28 09:22:33 crc kubenswrapper[4687]: I0228 09:22:33.301182 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 28 09:22:34 crc kubenswrapper[4687]: I0228 09:22:34.392418 4687 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b65eafc1-4ab2-47ec-ac65-5b9b8174833a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.210:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 28 09:22:34 crc kubenswrapper[4687]: I0228 09:22:34.392449 4687 prober.go:107] "Probe 
failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8d4ccf04-08de-4138-ba4a-b8f5659a37fc" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 28 09:22:34 crc kubenswrapper[4687]: I0228 09:22:34.392545 4687 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b65eafc1-4ab2-47ec-ac65-5b9b8174833a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.210:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 28 09:22:34 crc kubenswrapper[4687]: I0228 09:22:34.392603 4687 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8d4ccf04-08de-4138-ba4a-b8f5659a37fc" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 28 09:22:34 crc kubenswrapper[4687]: I0228 09:22:34.598496 4687 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod70abdfed-0686-450a-b900-2eda9b68cec7"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod70abdfed-0686-450a-b900-2eda9b68cec7] : Timed out while waiting for systemd to remove kubepods-besteffort-pod70abdfed_0686_450a_b900_2eda9b68cec7.slice" Feb 28 09:22:34 crc kubenswrapper[4687]: E0228 09:22:34.598553 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod70abdfed-0686-450a-b900-2eda9b68cec7] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod70abdfed-0686-450a-b900-2eda9b68cec7] : Timed out while waiting for systemd to remove kubepods-besteffort-pod70abdfed_0686_450a_b900_2eda9b68cec7.slice" pod="openshift-infra/auto-csr-approver-29537842-cgjsw" 
podUID="70abdfed-0686-450a-b900-2eda9b68cec7" Feb 28 09:22:35 crc kubenswrapper[4687]: I0228 09:22:35.012159 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537842-cgjsw" Feb 28 09:22:40 crc kubenswrapper[4687]: I0228 09:22:40.959852 4687 scope.go:117] "RemoveContainer" containerID="3e9f05e5348087ae22fa99f6c6e7c935f7a19de5011e2a8e76f694474f6eb090" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.299552 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.302648 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.305253 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.305313 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.305586 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.305623 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.306000 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.307628 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.308215 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.471168 4687 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7749c44969-67mrm"] Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.472693 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-67mrm" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.486380 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-67mrm"] Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.563559 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-dns-svc\") pod \"dnsmasq-dns-7749c44969-67mrm\" (UID: \"d393de87-edb5-4ebd-986b-2857110b1706\") " pod="openstack/dnsmasq-dns-7749c44969-67mrm" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.563597 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjqkj\" (UniqueName: \"kubernetes.io/projected/d393de87-edb5-4ebd-986b-2857110b1706-kube-api-access-xjqkj\") pod \"dnsmasq-dns-7749c44969-67mrm\" (UID: \"d393de87-edb5-4ebd-986b-2857110b1706\") " pod="openstack/dnsmasq-dns-7749c44969-67mrm" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.563644 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-ovsdbserver-sb\") pod \"dnsmasq-dns-7749c44969-67mrm\" (UID: \"d393de87-edb5-4ebd-986b-2857110b1706\") " pod="openstack/dnsmasq-dns-7749c44969-67mrm" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.563676 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-config\") pod \"dnsmasq-dns-7749c44969-67mrm\" (UID: \"d393de87-edb5-4ebd-986b-2857110b1706\") " 
pod="openstack/dnsmasq-dns-7749c44969-67mrm" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.563882 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-dns-swift-storage-0\") pod \"dnsmasq-dns-7749c44969-67mrm\" (UID: \"d393de87-edb5-4ebd-986b-2857110b1706\") " pod="openstack/dnsmasq-dns-7749c44969-67mrm" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.564171 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-ovsdbserver-nb\") pod \"dnsmasq-dns-7749c44969-67mrm\" (UID: \"d393de87-edb5-4ebd-986b-2857110b1706\") " pod="openstack/dnsmasq-dns-7749c44969-67mrm" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.665085 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-config\") pod \"dnsmasq-dns-7749c44969-67mrm\" (UID: \"d393de87-edb5-4ebd-986b-2857110b1706\") " pod="openstack/dnsmasq-dns-7749c44969-67mrm" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.665474 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-dns-swift-storage-0\") pod \"dnsmasq-dns-7749c44969-67mrm\" (UID: \"d393de87-edb5-4ebd-986b-2857110b1706\") " pod="openstack/dnsmasq-dns-7749c44969-67mrm" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.665558 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-ovsdbserver-nb\") pod \"dnsmasq-dns-7749c44969-67mrm\" (UID: \"d393de87-edb5-4ebd-986b-2857110b1706\") " 
pod="openstack/dnsmasq-dns-7749c44969-67mrm" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.665665 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjqkj\" (UniqueName: \"kubernetes.io/projected/d393de87-edb5-4ebd-986b-2857110b1706-kube-api-access-xjqkj\") pod \"dnsmasq-dns-7749c44969-67mrm\" (UID: \"d393de87-edb5-4ebd-986b-2857110b1706\") " pod="openstack/dnsmasq-dns-7749c44969-67mrm" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.665686 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-dns-svc\") pod \"dnsmasq-dns-7749c44969-67mrm\" (UID: \"d393de87-edb5-4ebd-986b-2857110b1706\") " pod="openstack/dnsmasq-dns-7749c44969-67mrm" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.665745 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-ovsdbserver-sb\") pod \"dnsmasq-dns-7749c44969-67mrm\" (UID: \"d393de87-edb5-4ebd-986b-2857110b1706\") " pod="openstack/dnsmasq-dns-7749c44969-67mrm" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.665880 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-config\") pod \"dnsmasq-dns-7749c44969-67mrm\" (UID: \"d393de87-edb5-4ebd-986b-2857110b1706\") " pod="openstack/dnsmasq-dns-7749c44969-67mrm" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.666578 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-dns-svc\") pod \"dnsmasq-dns-7749c44969-67mrm\" (UID: \"d393de87-edb5-4ebd-986b-2857110b1706\") " pod="openstack/dnsmasq-dns-7749c44969-67mrm" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 
09:22:43.666702 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-ovsdbserver-sb\") pod \"dnsmasq-dns-7749c44969-67mrm\" (UID: \"d393de87-edb5-4ebd-986b-2857110b1706\") " pod="openstack/dnsmasq-dns-7749c44969-67mrm" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.667275 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-ovsdbserver-nb\") pod \"dnsmasq-dns-7749c44969-67mrm\" (UID: \"d393de87-edb5-4ebd-986b-2857110b1706\") " pod="openstack/dnsmasq-dns-7749c44969-67mrm" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.667820 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-dns-swift-storage-0\") pod \"dnsmasq-dns-7749c44969-67mrm\" (UID: \"d393de87-edb5-4ebd-986b-2857110b1706\") " pod="openstack/dnsmasq-dns-7749c44969-67mrm" Feb 28 09:22:43 crc kubenswrapper[4687]: I0228 09:22:43.684861 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjqkj\" (UniqueName: \"kubernetes.io/projected/d393de87-edb5-4ebd-986b-2857110b1706-kube-api-access-xjqkj\") pod \"dnsmasq-dns-7749c44969-67mrm\" (UID: \"d393de87-edb5-4ebd-986b-2857110b1706\") " pod="openstack/dnsmasq-dns-7749c44969-67mrm" Feb 28 09:22:44 crc kubenswrapper[4687]: I0228 09:22:43.790982 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-67mrm" Feb 28 09:22:44 crc kubenswrapper[4687]: I0228 09:22:44.136972 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 28 09:22:44 crc kubenswrapper[4687]: I0228 09:22:44.548227 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-67mrm"] Feb 28 09:22:44 crc kubenswrapper[4687]: W0228 09:22:44.550453 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd393de87_edb5_4ebd_986b_2857110b1706.slice/crio-c4797f5ebd943aef8bb63af5b32b7b9f9b651aaa2e9f37ebebd76f92975ebc80 WatchSource:0}: Error finding container c4797f5ebd943aef8bb63af5b32b7b9f9b651aaa2e9f37ebebd76f92975ebc80: Status 404 returned error can't find the container with id c4797f5ebd943aef8bb63af5b32b7b9f9b651aaa2e9f37ebebd76f92975ebc80 Feb 28 09:22:45 crc kubenswrapper[4687]: I0228 09:22:45.148642 4687 generic.go:334] "Generic (PLEG): container finished" podID="d393de87-edb5-4ebd-986b-2857110b1706" containerID="62974ac7a3bb4f4fbf6aacafa370e1439a62488eb145b09d0e3ac042be13443d" exitCode=0 Feb 28 09:22:45 crc kubenswrapper[4687]: I0228 09:22:45.150297 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-67mrm" event={"ID":"d393de87-edb5-4ebd-986b-2857110b1706","Type":"ContainerDied","Data":"62974ac7a3bb4f4fbf6aacafa370e1439a62488eb145b09d0e3ac042be13443d"} Feb 28 09:22:45 crc kubenswrapper[4687]: I0228 09:22:45.150333 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-67mrm" event={"ID":"d393de87-edb5-4ebd-986b-2857110b1706","Type":"ContainerStarted","Data":"c4797f5ebd943aef8bb63af5b32b7b9f9b651aaa2e9f37ebebd76f92975ebc80"} Feb 28 09:22:45 crc kubenswrapper[4687]: I0228 09:22:45.219095 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:22:45 crc 
kubenswrapper[4687]: I0228 09:22:45.219392 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e664c500-7796-4b13-b10b-f65aa311a2cd" containerName="ceilometer-central-agent" containerID="cri-o://90da960ed34c192f62c5d2f924c1f01b0b901ddb7b46e74dbaeabcc9c2642e18" gracePeriod=30 Feb 28 09:22:45 crc kubenswrapper[4687]: I0228 09:22:45.219446 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e664c500-7796-4b13-b10b-f65aa311a2cd" containerName="proxy-httpd" containerID="cri-o://7f04d67585432201b32ab0df44fd253294856610d70ddb9b0929390a4537d9cd" gracePeriod=30 Feb 28 09:22:45 crc kubenswrapper[4687]: I0228 09:22:45.219492 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e664c500-7796-4b13-b10b-f65aa311a2cd" containerName="sg-core" containerID="cri-o://f92473cc8680583a887cbeacf84beb53b054d22055b5301bf91da953f58ed9dd" gracePeriod=30 Feb 28 09:22:45 crc kubenswrapper[4687]: I0228 09:22:45.219508 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e664c500-7796-4b13-b10b-f65aa311a2cd" containerName="ceilometer-notification-agent" containerID="cri-o://77ed7c1c1f17142acf765d748913a61c95f252bb07234c31049e2fcda7b9b5b0" gracePeriod=30 Feb 28 09:22:45 crc kubenswrapper[4687]: I0228 09:22:45.322429 4687 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e664c500-7796-4b13-b10b-f65aa311a2cd" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.207:3000/\": read tcp 10.217.0.2:53774->10.217.0.207:3000: read: connection reset by peer" Feb 28 09:22:45 crc kubenswrapper[4687]: I0228 09:22:45.718117 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.163870 4687 generic.go:334] "Generic (PLEG): 
container finished" podID="e664c500-7796-4b13-b10b-f65aa311a2cd" containerID="7f04d67585432201b32ab0df44fd253294856610d70ddb9b0929390a4537d9cd" exitCode=0 Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.164232 4687 generic.go:334] "Generic (PLEG): container finished" podID="e664c500-7796-4b13-b10b-f65aa311a2cd" containerID="f92473cc8680583a887cbeacf84beb53b054d22055b5301bf91da953f58ed9dd" exitCode=2 Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.164244 4687 generic.go:334] "Generic (PLEG): container finished" podID="e664c500-7796-4b13-b10b-f65aa311a2cd" containerID="77ed7c1c1f17142acf765d748913a61c95f252bb07234c31049e2fcda7b9b5b0" exitCode=0 Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.164256 4687 generic.go:334] "Generic (PLEG): container finished" podID="e664c500-7796-4b13-b10b-f65aa311a2cd" containerID="90da960ed34c192f62c5d2f924c1f01b0b901ddb7b46e74dbaeabcc9c2642e18" exitCode=0 Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.164338 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e664c500-7796-4b13-b10b-f65aa311a2cd","Type":"ContainerDied","Data":"7f04d67585432201b32ab0df44fd253294856610d70ddb9b0929390a4537d9cd"} Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.164374 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e664c500-7796-4b13-b10b-f65aa311a2cd","Type":"ContainerDied","Data":"f92473cc8680583a887cbeacf84beb53b054d22055b5301bf91da953f58ed9dd"} Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.164385 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e664c500-7796-4b13-b10b-f65aa311a2cd","Type":"ContainerDied","Data":"77ed7c1c1f17142acf765d748913a61c95f252bb07234c31049e2fcda7b9b5b0"} Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.164394 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e664c500-7796-4b13-b10b-f65aa311a2cd","Type":"ContainerDied","Data":"90da960ed34c192f62c5d2f924c1f01b0b901ddb7b46e74dbaeabcc9c2642e18"} Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.166518 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-67mrm" event={"ID":"d393de87-edb5-4ebd-986b-2857110b1706","Type":"ContainerStarted","Data":"7bc95be36b18ee7280ad5ee6d217184930449b81e6794a3ec32cf140c198b50e"} Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.166730 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7749c44969-67mrm" Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.166896 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b65eafc1-4ab2-47ec-ac65-5b9b8174833a" containerName="nova-api-log" containerID="cri-o://5b77c69d2a891abbb5966b518241f3d7d712a97e81262ac57ec3599b65d085db" gracePeriod=30 Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.166947 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b65eafc1-4ab2-47ec-ac65-5b9b8174833a" containerName="nova-api-api" containerID="cri-o://3a9210793c3823a5913e3ac6a049bf279b261a8dfa4e3a1f8de61c0439906721" gracePeriod=30 Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.190800 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7749c44969-67mrm" podStartSLOduration=3.190772299 podStartE2EDuration="3.190772299s" podCreationTimestamp="2026-02-28 09:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:22:46.183170815 +0000 UTC m=+1157.873740152" watchObservedRunningTime="2026-02-28 09:22:46.190772299 +0000 UTC m=+1157.881341636" Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.286456 4687 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.390409 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-scripts\") pod \"e664c500-7796-4b13-b10b-f65aa311a2cd\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.390508 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-sg-core-conf-yaml\") pod \"e664c500-7796-4b13-b10b-f65aa311a2cd\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.390556 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-ceilometer-tls-certs\") pod \"e664c500-7796-4b13-b10b-f65aa311a2cd\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.390610 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9b2tm\" (UniqueName: \"kubernetes.io/projected/e664c500-7796-4b13-b10b-f65aa311a2cd-kube-api-access-9b2tm\") pod \"e664c500-7796-4b13-b10b-f65aa311a2cd\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.390661 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e664c500-7796-4b13-b10b-f65aa311a2cd-log-httpd\") pod \"e664c500-7796-4b13-b10b-f65aa311a2cd\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.390749 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-combined-ca-bundle\") pod \"e664c500-7796-4b13-b10b-f65aa311a2cd\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.390872 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e664c500-7796-4b13-b10b-f65aa311a2cd-run-httpd\") pod \"e664c500-7796-4b13-b10b-f65aa311a2cd\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.390932 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-config-data\") pod \"e664c500-7796-4b13-b10b-f65aa311a2cd\" (UID: \"e664c500-7796-4b13-b10b-f65aa311a2cd\") " Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.391490 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e664c500-7796-4b13-b10b-f65aa311a2cd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e664c500-7796-4b13-b10b-f65aa311a2cd" (UID: "e664c500-7796-4b13-b10b-f65aa311a2cd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.391528 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e664c500-7796-4b13-b10b-f65aa311a2cd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e664c500-7796-4b13-b10b-f65aa311a2cd" (UID: "e664c500-7796-4b13-b10b-f65aa311a2cd"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.391958 4687 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e664c500-7796-4b13-b10b-f65aa311a2cd-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.391983 4687 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e664c500-7796-4b13-b10b-f65aa311a2cd-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.396428 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e664c500-7796-4b13-b10b-f65aa311a2cd-kube-api-access-9b2tm" (OuterVolumeSpecName: "kube-api-access-9b2tm") pod "e664c500-7796-4b13-b10b-f65aa311a2cd" (UID: "e664c500-7796-4b13-b10b-f65aa311a2cd"). InnerVolumeSpecName "kube-api-access-9b2tm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.397196 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-scripts" (OuterVolumeSpecName: "scripts") pod "e664c500-7796-4b13-b10b-f65aa311a2cd" (UID: "e664c500-7796-4b13-b10b-f65aa311a2cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.418149 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e664c500-7796-4b13-b10b-f65aa311a2cd" (UID: "e664c500-7796-4b13-b10b-f65aa311a2cd"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.434440 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e664c500-7796-4b13-b10b-f65aa311a2cd" (UID: "e664c500-7796-4b13-b10b-f65aa311a2cd"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.451843 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e664c500-7796-4b13-b10b-f65aa311a2cd" (UID: "e664c500-7796-4b13-b10b-f65aa311a2cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.464273 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-config-data" (OuterVolumeSpecName: "config-data") pod "e664c500-7796-4b13-b10b-f65aa311a2cd" (UID: "e664c500-7796-4b13-b10b-f65aa311a2cd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.495183 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.495212 4687 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.495222 4687 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.495240 4687 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.495252 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9b2tm\" (UniqueName: \"kubernetes.io/projected/e664c500-7796-4b13-b10b-f65aa311a2cd-kube-api-access-9b2tm\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:46 crc kubenswrapper[4687]: I0228 09:22:46.495263 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e664c500-7796-4b13-b10b-f65aa311a2cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.182157 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e664c500-7796-4b13-b10b-f65aa311a2cd","Type":"ContainerDied","Data":"521d04bc89081492eb783c4a5eecb0ebbd05faf0c47ed6a43e0516b78d2b1e69"} Feb 28 09:22:47 crc 
kubenswrapper[4687]: I0228 09:22:47.182204 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.182244 4687 scope.go:117] "RemoveContainer" containerID="7f04d67585432201b32ab0df44fd253294856610d70ddb9b0929390a4537d9cd" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.187911 4687 generic.go:334] "Generic (PLEG): container finished" podID="b65eafc1-4ab2-47ec-ac65-5b9b8174833a" containerID="5b77c69d2a891abbb5966b518241f3d7d712a97e81262ac57ec3599b65d085db" exitCode=143 Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.187944 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b65eafc1-4ab2-47ec-ac65-5b9b8174833a","Type":"ContainerDied","Data":"5b77c69d2a891abbb5966b518241f3d7d712a97e81262ac57ec3599b65d085db"} Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.209410 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.210661 4687 scope.go:117] "RemoveContainer" containerID="f92473cc8680583a887cbeacf84beb53b054d22055b5301bf91da953f58ed9dd" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.215631 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.232162 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:22:47 crc kubenswrapper[4687]: E0228 09:22:47.232546 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e664c500-7796-4b13-b10b-f65aa311a2cd" containerName="sg-core" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.232566 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="e664c500-7796-4b13-b10b-f65aa311a2cd" containerName="sg-core" Feb 28 09:22:47 crc kubenswrapper[4687]: E0228 09:22:47.232578 4687 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e664c500-7796-4b13-b10b-f65aa311a2cd" containerName="ceilometer-central-agent" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.232585 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="e664c500-7796-4b13-b10b-f65aa311a2cd" containerName="ceilometer-central-agent" Feb 28 09:22:47 crc kubenswrapper[4687]: E0228 09:22:47.232593 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e664c500-7796-4b13-b10b-f65aa311a2cd" containerName="proxy-httpd" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.232600 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="e664c500-7796-4b13-b10b-f65aa311a2cd" containerName="proxy-httpd" Feb 28 09:22:47 crc kubenswrapper[4687]: E0228 09:22:47.232611 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e664c500-7796-4b13-b10b-f65aa311a2cd" containerName="ceilometer-notification-agent" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.232618 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="e664c500-7796-4b13-b10b-f65aa311a2cd" containerName="ceilometer-notification-agent" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.232791 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="e664c500-7796-4b13-b10b-f65aa311a2cd" containerName="sg-core" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.232806 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="e664c500-7796-4b13-b10b-f65aa311a2cd" containerName="ceilometer-central-agent" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.232816 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="e664c500-7796-4b13-b10b-f65aa311a2cd" containerName="ceilometer-notification-agent" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.232826 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="e664c500-7796-4b13-b10b-f65aa311a2cd" containerName="proxy-httpd" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 
09:22:47.234507 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.238533 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.238575 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.238548 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.239006 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.242708 4687 scope.go:117] "RemoveContainer" containerID="77ed7c1c1f17142acf765d748913a61c95f252bb07234c31049e2fcda7b9b5b0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.273985 4687 scope.go:117] "RemoveContainer" containerID="90da960ed34c192f62c5d2f924c1f01b0b901ddb7b46e74dbaeabcc9c2642e18" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.315664 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d031a035-5ae3-4544-9181-756dba921ef0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d031a035-5ae3-4544-9181-756dba921ef0\") " pod="openstack/ceilometer-0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.315813 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d031a035-5ae3-4544-9181-756dba921ef0-scripts\") pod \"ceilometer-0\" (UID: \"d031a035-5ae3-4544-9181-756dba921ef0\") " pod="openstack/ceilometer-0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.316237 4687 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d031a035-5ae3-4544-9181-756dba921ef0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d031a035-5ae3-4544-9181-756dba921ef0\") " pod="openstack/ceilometer-0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.316390 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d031a035-5ae3-4544-9181-756dba921ef0-config-data\") pod \"ceilometer-0\" (UID: \"d031a035-5ae3-4544-9181-756dba921ef0\") " pod="openstack/ceilometer-0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.316639 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d031a035-5ae3-4544-9181-756dba921ef0-log-httpd\") pod \"ceilometer-0\" (UID: \"d031a035-5ae3-4544-9181-756dba921ef0\") " pod="openstack/ceilometer-0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.316747 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d031a035-5ae3-4544-9181-756dba921ef0-run-httpd\") pod \"ceilometer-0\" (UID: \"d031a035-5ae3-4544-9181-756dba921ef0\") " pod="openstack/ceilometer-0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.316814 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6llt\" (UniqueName: \"kubernetes.io/projected/d031a035-5ae3-4544-9181-756dba921ef0-kube-api-access-b6llt\") pod \"ceilometer-0\" (UID: \"d031a035-5ae3-4544-9181-756dba921ef0\") " pod="openstack/ceilometer-0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.316841 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d031a035-5ae3-4544-9181-756dba921ef0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d031a035-5ae3-4544-9181-756dba921ef0\") " pod="openstack/ceilometer-0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.419693 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d031a035-5ae3-4544-9181-756dba921ef0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d031a035-5ae3-4544-9181-756dba921ef0\") " pod="openstack/ceilometer-0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.419993 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d031a035-5ae3-4544-9181-756dba921ef0-config-data\") pod \"ceilometer-0\" (UID: \"d031a035-5ae3-4544-9181-756dba921ef0\") " pod="openstack/ceilometer-0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.420105 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d031a035-5ae3-4544-9181-756dba921ef0-log-httpd\") pod \"ceilometer-0\" (UID: \"d031a035-5ae3-4544-9181-756dba921ef0\") " pod="openstack/ceilometer-0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.420239 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d031a035-5ae3-4544-9181-756dba921ef0-run-httpd\") pod \"ceilometer-0\" (UID: \"d031a035-5ae3-4544-9181-756dba921ef0\") " pod="openstack/ceilometer-0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.420318 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6llt\" (UniqueName: \"kubernetes.io/projected/d031a035-5ae3-4544-9181-756dba921ef0-kube-api-access-b6llt\") pod \"ceilometer-0\" (UID: \"d031a035-5ae3-4544-9181-756dba921ef0\") " pod="openstack/ceilometer-0" Feb 28 09:22:47 crc 
kubenswrapper[4687]: I0228 09:22:47.420380 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d031a035-5ae3-4544-9181-756dba921ef0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d031a035-5ae3-4544-9181-756dba921ef0\") " pod="openstack/ceilometer-0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.420470 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d031a035-5ae3-4544-9181-756dba921ef0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d031a035-5ae3-4544-9181-756dba921ef0\") " pod="openstack/ceilometer-0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.420566 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d031a035-5ae3-4544-9181-756dba921ef0-scripts\") pod \"ceilometer-0\" (UID: \"d031a035-5ae3-4544-9181-756dba921ef0\") " pod="openstack/ceilometer-0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.420589 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d031a035-5ae3-4544-9181-756dba921ef0-log-httpd\") pod \"ceilometer-0\" (UID: \"d031a035-5ae3-4544-9181-756dba921ef0\") " pod="openstack/ceilometer-0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.420721 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d031a035-5ae3-4544-9181-756dba921ef0-run-httpd\") pod \"ceilometer-0\" (UID: \"d031a035-5ae3-4544-9181-756dba921ef0\") " pod="openstack/ceilometer-0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.424348 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d031a035-5ae3-4544-9181-756dba921ef0-ceilometer-tls-certs\") pod \"ceilometer-0\" 
(UID: \"d031a035-5ae3-4544-9181-756dba921ef0\") " pod="openstack/ceilometer-0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.424931 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d031a035-5ae3-4544-9181-756dba921ef0-scripts\") pod \"ceilometer-0\" (UID: \"d031a035-5ae3-4544-9181-756dba921ef0\") " pod="openstack/ceilometer-0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.425402 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d031a035-5ae3-4544-9181-756dba921ef0-config-data\") pod \"ceilometer-0\" (UID: \"d031a035-5ae3-4544-9181-756dba921ef0\") " pod="openstack/ceilometer-0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.425615 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d031a035-5ae3-4544-9181-756dba921ef0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d031a035-5ae3-4544-9181-756dba921ef0\") " pod="openstack/ceilometer-0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.430305 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d031a035-5ae3-4544-9181-756dba921ef0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d031a035-5ae3-4544-9181-756dba921ef0\") " pod="openstack/ceilometer-0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.436627 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6llt\" (UniqueName: \"kubernetes.io/projected/d031a035-5ae3-4544-9181-756dba921ef0-kube-api-access-b6llt\") pod \"ceilometer-0\" (UID: \"d031a035-5ae3-4544-9181-756dba921ef0\") " pod="openstack/ceilometer-0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.554163 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 28 09:22:47 crc kubenswrapper[4687]: I0228 09:22:47.974556 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 28 09:22:47 crc kubenswrapper[4687]: W0228 09:22:47.976372 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd031a035_5ae3_4544_9181_756dba921ef0.slice/crio-01780ef010be87c757595a3ba42f59c10454ca7a62d63be571010b1d0528803e WatchSource:0}: Error finding container 01780ef010be87c757595a3ba42f59c10454ca7a62d63be571010b1d0528803e: Status 404 returned error can't find the container with id 01780ef010be87c757595a3ba42f59c10454ca7a62d63be571010b1d0528803e Feb 28 09:22:48 crc kubenswrapper[4687]: I0228 09:22:48.196842 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d031a035-5ae3-4544-9181-756dba921ef0","Type":"ContainerStarted","Data":"01780ef010be87c757595a3ba42f59c10454ca7a62d63be571010b1d0528803e"} Feb 28 09:22:48 crc kubenswrapper[4687]: I0228 09:22:48.667595 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e664c500-7796-4b13-b10b-f65aa311a2cd" path="/var/lib/kubelet/pods/e664c500-7796-4b13-b10b-f65aa311a2cd/volumes" Feb 28 09:22:49 crc kubenswrapper[4687]: I0228 09:22:49.212668 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d031a035-5ae3-4544-9181-756dba921ef0","Type":"ContainerStarted","Data":"94439e51fbc08adefe2ab368538146cfeb1c48d9afc0acc6fc09248fc0512df0"} Feb 28 09:22:49 crc kubenswrapper[4687]: I0228 09:22:49.684870 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 28 09:22:49 crc kubenswrapper[4687]: I0228 09:22:49.779425 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b65eafc1-4ab2-47ec-ac65-5b9b8174833a-combined-ca-bundle\") pod \"b65eafc1-4ab2-47ec-ac65-5b9b8174833a\" (UID: \"b65eafc1-4ab2-47ec-ac65-5b9b8174833a\") " Feb 28 09:22:49 crc kubenswrapper[4687]: I0228 09:22:49.779685 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5hkn\" (UniqueName: \"kubernetes.io/projected/b65eafc1-4ab2-47ec-ac65-5b9b8174833a-kube-api-access-r5hkn\") pod \"b65eafc1-4ab2-47ec-ac65-5b9b8174833a\" (UID: \"b65eafc1-4ab2-47ec-ac65-5b9b8174833a\") " Feb 28 09:22:49 crc kubenswrapper[4687]: I0228 09:22:49.779736 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b65eafc1-4ab2-47ec-ac65-5b9b8174833a-config-data\") pod \"b65eafc1-4ab2-47ec-ac65-5b9b8174833a\" (UID: \"b65eafc1-4ab2-47ec-ac65-5b9b8174833a\") " Feb 28 09:22:49 crc kubenswrapper[4687]: I0228 09:22:49.779801 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b65eafc1-4ab2-47ec-ac65-5b9b8174833a-logs\") pod \"b65eafc1-4ab2-47ec-ac65-5b9b8174833a\" (UID: \"b65eafc1-4ab2-47ec-ac65-5b9b8174833a\") " Feb 28 09:22:49 crc kubenswrapper[4687]: I0228 09:22:49.780365 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b65eafc1-4ab2-47ec-ac65-5b9b8174833a-logs" (OuterVolumeSpecName: "logs") pod "b65eafc1-4ab2-47ec-ac65-5b9b8174833a" (UID: "b65eafc1-4ab2-47ec-ac65-5b9b8174833a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:22:49 crc kubenswrapper[4687]: I0228 09:22:49.790636 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b65eafc1-4ab2-47ec-ac65-5b9b8174833a-kube-api-access-r5hkn" (OuterVolumeSpecName: "kube-api-access-r5hkn") pod "b65eafc1-4ab2-47ec-ac65-5b9b8174833a" (UID: "b65eafc1-4ab2-47ec-ac65-5b9b8174833a"). InnerVolumeSpecName "kube-api-access-r5hkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:22:49 crc kubenswrapper[4687]: I0228 09:22:49.802926 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b65eafc1-4ab2-47ec-ac65-5b9b8174833a-config-data" (OuterVolumeSpecName: "config-data") pod "b65eafc1-4ab2-47ec-ac65-5b9b8174833a" (UID: "b65eafc1-4ab2-47ec-ac65-5b9b8174833a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:49 crc kubenswrapper[4687]: I0228 09:22:49.812618 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b65eafc1-4ab2-47ec-ac65-5b9b8174833a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b65eafc1-4ab2-47ec-ac65-5b9b8174833a" (UID: "b65eafc1-4ab2-47ec-ac65-5b9b8174833a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:22:49 crc kubenswrapper[4687]: I0228 09:22:49.887997 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b65eafc1-4ab2-47ec-ac65-5b9b8174833a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:49 crc kubenswrapper[4687]: I0228 09:22:49.888038 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5hkn\" (UniqueName: \"kubernetes.io/projected/b65eafc1-4ab2-47ec-ac65-5b9b8174833a-kube-api-access-r5hkn\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:49 crc kubenswrapper[4687]: I0228 09:22:49.888052 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b65eafc1-4ab2-47ec-ac65-5b9b8174833a-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:49 crc kubenswrapper[4687]: I0228 09:22:49.888060 4687 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b65eafc1-4ab2-47ec-ac65-5b9b8174833a-logs\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.226132 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d031a035-5ae3-4544-9181-756dba921ef0","Type":"ContainerStarted","Data":"a24a8d45d9e697bb8b52f7bae4c84dc08cd3cb6d024ac2f21c03b2a7df33520d"} Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.226564 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d031a035-5ae3-4544-9181-756dba921ef0","Type":"ContainerStarted","Data":"f6688c3e452de146c5ce6882869f6dd0bdb3139f1ffe6d6a3ede491d5e1b606b"} Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.239665 4687 generic.go:334] "Generic (PLEG): container finished" podID="b65eafc1-4ab2-47ec-ac65-5b9b8174833a" containerID="3a9210793c3823a5913e3ac6a049bf279b261a8dfa4e3a1f8de61c0439906721" exitCode=0 Feb 28 09:22:50 crc 
kubenswrapper[4687]: I0228 09:22:50.239731 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b65eafc1-4ab2-47ec-ac65-5b9b8174833a","Type":"ContainerDied","Data":"3a9210793c3823a5913e3ac6a049bf279b261a8dfa4e3a1f8de61c0439906721"} Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.239772 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b65eafc1-4ab2-47ec-ac65-5b9b8174833a","Type":"ContainerDied","Data":"37130d5102268bd7a8aeb68eae7bef3349684ee5951c5a8d8453027d65fd4187"} Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.239778 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.239795 4687 scope.go:117] "RemoveContainer" containerID="3a9210793c3823a5913e3ac6a049bf279b261a8dfa4e3a1f8de61c0439906721" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.271943 4687 scope.go:117] "RemoveContainer" containerID="5b77c69d2a891abbb5966b518241f3d7d712a97e81262ac57ec3599b65d085db" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.289095 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.295486 4687 scope.go:117] "RemoveContainer" containerID="3a9210793c3823a5913e3ac6a049bf279b261a8dfa4e3a1f8de61c0439906721" Feb 28 09:22:50 crc kubenswrapper[4687]: E0228 09:22:50.295896 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a9210793c3823a5913e3ac6a049bf279b261a8dfa4e3a1f8de61c0439906721\": container with ID starting with 3a9210793c3823a5913e3ac6a049bf279b261a8dfa4e3a1f8de61c0439906721 not found: ID does not exist" containerID="3a9210793c3823a5913e3ac6a049bf279b261a8dfa4e3a1f8de61c0439906721" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.295936 4687 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a9210793c3823a5913e3ac6a049bf279b261a8dfa4e3a1f8de61c0439906721"} err="failed to get container status \"3a9210793c3823a5913e3ac6a049bf279b261a8dfa4e3a1f8de61c0439906721\": rpc error: code = NotFound desc = could not find container \"3a9210793c3823a5913e3ac6a049bf279b261a8dfa4e3a1f8de61c0439906721\": container with ID starting with 3a9210793c3823a5913e3ac6a049bf279b261a8dfa4e3a1f8de61c0439906721 not found: ID does not exist" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.295971 4687 scope.go:117] "RemoveContainer" containerID="5b77c69d2a891abbb5966b518241f3d7d712a97e81262ac57ec3599b65d085db" Feb 28 09:22:50 crc kubenswrapper[4687]: E0228 09:22:50.296540 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b77c69d2a891abbb5966b518241f3d7d712a97e81262ac57ec3599b65d085db\": container with ID starting with 5b77c69d2a891abbb5966b518241f3d7d712a97e81262ac57ec3599b65d085db not found: ID does not exist" containerID="5b77c69d2a891abbb5966b518241f3d7d712a97e81262ac57ec3599b65d085db" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.296588 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b77c69d2a891abbb5966b518241f3d7d712a97e81262ac57ec3599b65d085db"} err="failed to get container status \"5b77c69d2a891abbb5966b518241f3d7d712a97e81262ac57ec3599b65d085db\": rpc error: code = NotFound desc = could not find container \"5b77c69d2a891abbb5966b518241f3d7d712a97e81262ac57ec3599b65d085db\": container with ID starting with 5b77c69d2a891abbb5966b518241f3d7d712a97e81262ac57ec3599b65d085db not found: ID does not exist" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.300461 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.326290 4687 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-api-0"] Feb 28 09:22:50 crc kubenswrapper[4687]: E0228 09:22:50.331010 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b65eafc1-4ab2-47ec-ac65-5b9b8174833a" containerName="nova-api-log" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.331048 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="b65eafc1-4ab2-47ec-ac65-5b9b8174833a" containerName="nova-api-log" Feb 28 09:22:50 crc kubenswrapper[4687]: E0228 09:22:50.331091 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b65eafc1-4ab2-47ec-ac65-5b9b8174833a" containerName="nova-api-api" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.331101 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="b65eafc1-4ab2-47ec-ac65-5b9b8174833a" containerName="nova-api-api" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.331776 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="b65eafc1-4ab2-47ec-ac65-5b9b8174833a" containerName="nova-api-api" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.331805 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="b65eafc1-4ab2-47ec-ac65-5b9b8174833a" containerName="nova-api-log" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.333665 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.336690 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.336838 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.337036 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.356888 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.399780 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a36f861b-f068-4184-bca3-ef07c5d8cec5-public-tls-certs\") pod \"nova-api-0\" (UID: \"a36f861b-f068-4184-bca3-ef07c5d8cec5\") " pod="openstack/nova-api-0" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.399852 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a36f861b-f068-4184-bca3-ef07c5d8cec5-config-data\") pod \"nova-api-0\" (UID: \"a36f861b-f068-4184-bca3-ef07c5d8cec5\") " pod="openstack/nova-api-0" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.399878 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a36f861b-f068-4184-bca3-ef07c5d8cec5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a36f861b-f068-4184-bca3-ef07c5d8cec5\") " pod="openstack/nova-api-0" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.399901 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/a36f861b-f068-4184-bca3-ef07c5d8cec5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a36f861b-f068-4184-bca3-ef07c5d8cec5\") " pod="openstack/nova-api-0" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.399961 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a36f861b-f068-4184-bca3-ef07c5d8cec5-logs\") pod \"nova-api-0\" (UID: \"a36f861b-f068-4184-bca3-ef07c5d8cec5\") " pod="openstack/nova-api-0" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.400003 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v45wf\" (UniqueName: \"kubernetes.io/projected/a36f861b-f068-4184-bca3-ef07c5d8cec5-kube-api-access-v45wf\") pod \"nova-api-0\" (UID: \"a36f861b-f068-4184-bca3-ef07c5d8cec5\") " pod="openstack/nova-api-0" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.501192 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v45wf\" (UniqueName: \"kubernetes.io/projected/a36f861b-f068-4184-bca3-ef07c5d8cec5-kube-api-access-v45wf\") pod \"nova-api-0\" (UID: \"a36f861b-f068-4184-bca3-ef07c5d8cec5\") " pod="openstack/nova-api-0" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.501261 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a36f861b-f068-4184-bca3-ef07c5d8cec5-public-tls-certs\") pod \"nova-api-0\" (UID: \"a36f861b-f068-4184-bca3-ef07c5d8cec5\") " pod="openstack/nova-api-0" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.501307 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a36f861b-f068-4184-bca3-ef07c5d8cec5-config-data\") pod \"nova-api-0\" (UID: \"a36f861b-f068-4184-bca3-ef07c5d8cec5\") " pod="openstack/nova-api-0" Feb 
28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.501332 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a36f861b-f068-4184-bca3-ef07c5d8cec5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a36f861b-f068-4184-bca3-ef07c5d8cec5\") " pod="openstack/nova-api-0" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.501351 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a36f861b-f068-4184-bca3-ef07c5d8cec5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a36f861b-f068-4184-bca3-ef07c5d8cec5\") " pod="openstack/nova-api-0" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.501409 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a36f861b-f068-4184-bca3-ef07c5d8cec5-logs\") pod \"nova-api-0\" (UID: \"a36f861b-f068-4184-bca3-ef07c5d8cec5\") " pod="openstack/nova-api-0" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.501869 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a36f861b-f068-4184-bca3-ef07c5d8cec5-logs\") pod \"nova-api-0\" (UID: \"a36f861b-f068-4184-bca3-ef07c5d8cec5\") " pod="openstack/nova-api-0" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.507763 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a36f861b-f068-4184-bca3-ef07c5d8cec5-public-tls-certs\") pod \"nova-api-0\" (UID: \"a36f861b-f068-4184-bca3-ef07c5d8cec5\") " pod="openstack/nova-api-0" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.508309 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a36f861b-f068-4184-bca3-ef07c5d8cec5-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"a36f861b-f068-4184-bca3-ef07c5d8cec5\") " pod="openstack/nova-api-0" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.508815 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a36f861b-f068-4184-bca3-ef07c5d8cec5-config-data\") pod \"nova-api-0\" (UID: \"a36f861b-f068-4184-bca3-ef07c5d8cec5\") " pod="openstack/nova-api-0" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.516567 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v45wf\" (UniqueName: \"kubernetes.io/projected/a36f861b-f068-4184-bca3-ef07c5d8cec5-kube-api-access-v45wf\") pod \"nova-api-0\" (UID: \"a36f861b-f068-4184-bca3-ef07c5d8cec5\") " pod="openstack/nova-api-0" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.520653 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a36f861b-f068-4184-bca3-ef07c5d8cec5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a36f861b-f068-4184-bca3-ef07c5d8cec5\") " pod="openstack/nova-api-0" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.657236 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 28 09:22:50 crc kubenswrapper[4687]: I0228 09:22:50.671452 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b65eafc1-4ab2-47ec-ac65-5b9b8174833a" path="/var/lib/kubelet/pods/b65eafc1-4ab2-47ec-ac65-5b9b8174833a/volumes" Feb 28 09:22:51 crc kubenswrapper[4687]: I0228 09:22:51.081649 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 28 09:22:51 crc kubenswrapper[4687]: W0228 09:22:51.081963 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda36f861b_f068_4184_bca3_ef07c5d8cec5.slice/crio-af138de33caf2083eebd1530763d894c47e4c1d93a0ad0d01ce601b7074cf7c7 WatchSource:0}: Error finding container af138de33caf2083eebd1530763d894c47e4c1d93a0ad0d01ce601b7074cf7c7: Status 404 returned error can't find the container with id af138de33caf2083eebd1530763d894c47e4c1d93a0ad0d01ce601b7074cf7c7 Feb 28 09:22:51 crc kubenswrapper[4687]: I0228 09:22:51.252439 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a36f861b-f068-4184-bca3-ef07c5d8cec5","Type":"ContainerStarted","Data":"01efc740ca7a08d2140b43a34cd4d00ea6c11c820e4874d4ce62040f2cecf576"} Feb 28 09:22:51 crc kubenswrapper[4687]: I0228 09:22:51.252855 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a36f861b-f068-4184-bca3-ef07c5d8cec5","Type":"ContainerStarted","Data":"af138de33caf2083eebd1530763d894c47e4c1d93a0ad0d01ce601b7074cf7c7"} Feb 28 09:22:52 crc kubenswrapper[4687]: I0228 09:22:52.268408 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a36f861b-f068-4184-bca3-ef07c5d8cec5","Type":"ContainerStarted","Data":"6d20b309132d4980a46dd0f5387ff768c37122244b72881f26fd422cae8de58e"} Feb 28 09:22:52 crc kubenswrapper[4687]: I0228 09:22:52.271551 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"d031a035-5ae3-4544-9181-756dba921ef0","Type":"ContainerStarted","Data":"1ce6998d4b7e5aca34278204bf88f34a8277757653be3d45ebd113226e2dc7f7"} Feb 28 09:22:52 crc kubenswrapper[4687]: I0228 09:22:52.272241 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 28 09:22:52 crc kubenswrapper[4687]: I0228 09:22:52.298052 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.298038017 podStartE2EDuration="2.298038017s" podCreationTimestamp="2026-02-28 09:22:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:22:52.287864196 +0000 UTC m=+1163.978433533" watchObservedRunningTime="2026-02-28 09:22:52.298038017 +0000 UTC m=+1163.988607354" Feb 28 09:22:52 crc kubenswrapper[4687]: I0228 09:22:52.318429 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.677604828 podStartE2EDuration="5.318410856s" podCreationTimestamp="2026-02-28 09:22:47 +0000 UTC" firstStartedPulling="2026-02-28 09:22:47.978920745 +0000 UTC m=+1159.669490082" lastFinishedPulling="2026-02-28 09:22:51.619726773 +0000 UTC m=+1163.310296110" observedRunningTime="2026-02-28 09:22:52.314252987 +0000 UTC m=+1164.004822325" watchObservedRunningTime="2026-02-28 09:22:52.318410856 +0000 UTC m=+1164.008980193" Feb 28 09:22:53 crc kubenswrapper[4687]: I0228 09:22:53.792239 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7749c44969-67mrm" Feb 28 09:22:53 crc kubenswrapper[4687]: I0228 09:22:53.853424 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-84mdh"] Feb 28 09:22:53 crc kubenswrapper[4687]: I0228 09:22:53.854297 4687 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" podUID="56ba6f0a-8cc9-41a9-9444-5e338bd8a300" containerName="dnsmasq-dns" containerID="cri-o://9b75fc2922aafa021e089557a41ded5331bfaa304c114d21b24ef171a7c92f45" gracePeriod=10 Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.311212 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.312562 4687 generic.go:334] "Generic (PLEG): container finished" podID="56ba6f0a-8cc9-41a9-9444-5e338bd8a300" containerID="9b75fc2922aafa021e089557a41ded5331bfaa304c114d21b24ef171a7c92f45" exitCode=0 Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.312750 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" event={"ID":"56ba6f0a-8cc9-41a9-9444-5e338bd8a300","Type":"ContainerDied","Data":"9b75fc2922aafa021e089557a41ded5331bfaa304c114d21b24ef171a7c92f45"} Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.312790 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" event={"ID":"56ba6f0a-8cc9-41a9-9444-5e338bd8a300","Type":"ContainerDied","Data":"1e5a8baa59919f6ec0c426d39c8edd513acb42eb01510b2ededf8fb479aaebe9"} Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.312815 4687 scope.go:117] "RemoveContainer" containerID="9b75fc2922aafa021e089557a41ded5331bfaa304c114d21b24ef171a7c92f45" Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.339474 4687 scope.go:117] "RemoveContainer" containerID="0246347d2a87ea04c4eadcf8749195860b6c2f9e1dc6da05148ce3725b2a30ed" Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.366148 4687 scope.go:117] "RemoveContainer" containerID="9b75fc2922aafa021e089557a41ded5331bfaa304c114d21b24ef171a7c92f45" Feb 28 09:22:54 crc kubenswrapper[4687]: E0228 09:22:54.366750 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"9b75fc2922aafa021e089557a41ded5331bfaa304c114d21b24ef171a7c92f45\": container with ID starting with 9b75fc2922aafa021e089557a41ded5331bfaa304c114d21b24ef171a7c92f45 not found: ID does not exist" containerID="9b75fc2922aafa021e089557a41ded5331bfaa304c114d21b24ef171a7c92f45" Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.366784 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b75fc2922aafa021e089557a41ded5331bfaa304c114d21b24ef171a7c92f45"} err="failed to get container status \"9b75fc2922aafa021e089557a41ded5331bfaa304c114d21b24ef171a7c92f45\": rpc error: code = NotFound desc = could not find container \"9b75fc2922aafa021e089557a41ded5331bfaa304c114d21b24ef171a7c92f45\": container with ID starting with 9b75fc2922aafa021e089557a41ded5331bfaa304c114d21b24ef171a7c92f45 not found: ID does not exist" Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.366811 4687 scope.go:117] "RemoveContainer" containerID="0246347d2a87ea04c4eadcf8749195860b6c2f9e1dc6da05148ce3725b2a30ed" Feb 28 09:22:54 crc kubenswrapper[4687]: E0228 09:22:54.367181 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0246347d2a87ea04c4eadcf8749195860b6c2f9e1dc6da05148ce3725b2a30ed\": container with ID starting with 0246347d2a87ea04c4eadcf8749195860b6c2f9e1dc6da05148ce3725b2a30ed not found: ID does not exist" containerID="0246347d2a87ea04c4eadcf8749195860b6c2f9e1dc6da05148ce3725b2a30ed" Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.367209 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0246347d2a87ea04c4eadcf8749195860b6c2f9e1dc6da05148ce3725b2a30ed"} err="failed to get container status \"0246347d2a87ea04c4eadcf8749195860b6c2f9e1dc6da05148ce3725b2a30ed\": rpc error: code = NotFound desc = could not find container \"0246347d2a87ea04c4eadcf8749195860b6c2f9e1dc6da05148ce3725b2a30ed\": 
container with ID starting with 0246347d2a87ea04c4eadcf8749195860b6c2f9e1dc6da05148ce3725b2a30ed not found: ID does not exist" Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.406181 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-ovsdbserver-nb\") pod \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\" (UID: \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\") " Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.406371 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf492\" (UniqueName: \"kubernetes.io/projected/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-kube-api-access-jf492\") pod \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\" (UID: \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\") " Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.406499 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-ovsdbserver-sb\") pod \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\" (UID: \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\") " Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.406595 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-dns-swift-storage-0\") pod \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\" (UID: \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\") " Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.406766 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-config\") pod \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\" (UID: \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\") " Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.406887 4687 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-dns-svc\") pod \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\" (UID: \"56ba6f0a-8cc9-41a9-9444-5e338bd8a300\") " Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.412116 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-kube-api-access-jf492" (OuterVolumeSpecName: "kube-api-access-jf492") pod "56ba6f0a-8cc9-41a9-9444-5e338bd8a300" (UID: "56ba6f0a-8cc9-41a9-9444-5e338bd8a300"). InnerVolumeSpecName "kube-api-access-jf492". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.447207 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "56ba6f0a-8cc9-41a9-9444-5e338bd8a300" (UID: "56ba6f0a-8cc9-41a9-9444-5e338bd8a300"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.449030 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-config" (OuterVolumeSpecName: "config") pod "56ba6f0a-8cc9-41a9-9444-5e338bd8a300" (UID: "56ba6f0a-8cc9-41a9-9444-5e338bd8a300"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.449046 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "56ba6f0a-8cc9-41a9-9444-5e338bd8a300" (UID: "56ba6f0a-8cc9-41a9-9444-5e338bd8a300"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.450917 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "56ba6f0a-8cc9-41a9-9444-5e338bd8a300" (UID: "56ba6f0a-8cc9-41a9-9444-5e338bd8a300"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.454445 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "56ba6f0a-8cc9-41a9-9444-5e338bd8a300" (UID: "56ba6f0a-8cc9-41a9-9444-5e338bd8a300"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.510728 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf492\" (UniqueName: \"kubernetes.io/projected/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-kube-api-access-jf492\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.510783 4687 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.510794 4687 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.510808 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-config\") on node 
\"crc\" DevicePath \"\"" Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.510820 4687 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:54 crc kubenswrapper[4687]: I0228 09:22:54.510829 4687 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56ba6f0a-8cc9-41a9-9444-5e338bd8a300-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 28 09:22:55 crc kubenswrapper[4687]: I0228 09:22:55.002609 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:22:55 crc kubenswrapper[4687]: I0228 09:22:55.002914 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:22:55 crc kubenswrapper[4687]: I0228 09:22:55.002965 4687 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" Feb 28 09:22:55 crc kubenswrapper[4687]: I0228 09:22:55.003896 4687 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"70e6449ca6d918497ca91c82bcac17a1011e8ea5698b1bdf893e712bee9903d3"} pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 28 09:22:55 crc kubenswrapper[4687]: I0228 09:22:55.003955 
4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" containerID="cri-o://70e6449ca6d918497ca91c82bcac17a1011e8ea5698b1bdf893e712bee9903d3" gracePeriod=600 Feb 28 09:22:55 crc kubenswrapper[4687]: I0228 09:22:55.323299 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-84mdh" Feb 28 09:22:55 crc kubenswrapper[4687]: I0228 09:22:55.326647 4687 generic.go:334] "Generic (PLEG): container finished" podID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerID="70e6449ca6d918497ca91c82bcac17a1011e8ea5698b1bdf893e712bee9903d3" exitCode=0 Feb 28 09:22:55 crc kubenswrapper[4687]: I0228 09:22:55.326716 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerDied","Data":"70e6449ca6d918497ca91c82bcac17a1011e8ea5698b1bdf893e712bee9903d3"} Feb 28 09:22:55 crc kubenswrapper[4687]: I0228 09:22:55.326805 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerStarted","Data":"26defbc0a15ba55a0f8e3a7678fa01c73c7ea2162c34ef63cf8b44425106ed7e"} Feb 28 09:22:55 crc kubenswrapper[4687]: I0228 09:22:55.326838 4687 scope.go:117] "RemoveContainer" containerID="f16534f65e44ed5dcb5a741301bfadba47516c592259f18b72f5912611ebb09f" Feb 28 09:22:55 crc kubenswrapper[4687]: I0228 09:22:55.347523 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-84mdh"] Feb 28 09:22:55 crc kubenswrapper[4687]: I0228 09:22:55.356703 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-84mdh"] Feb 28 09:22:56 crc kubenswrapper[4687]: 
I0228 09:22:56.669432 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56ba6f0a-8cc9-41a9-9444-5e338bd8a300" path="/var/lib/kubelet/pods/56ba6f0a-8cc9-41a9-9444-5e338bd8a300/volumes" Feb 28 09:23:00 crc kubenswrapper[4687]: I0228 09:23:00.668826 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 28 09:23:00 crc kubenswrapper[4687]: I0228 09:23:00.669597 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 28 09:23:01 crc kubenswrapper[4687]: I0228 09:23:01.672149 4687 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a36f861b-f068-4184-bca3-ef07c5d8cec5" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.213:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 28 09:23:01 crc kubenswrapper[4687]: I0228 09:23:01.672149 4687 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a36f861b-f068-4184-bca3-ef07c5d8cec5" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.213:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 28 09:23:10 crc kubenswrapper[4687]: I0228 09:23:10.665584 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 28 09:23:10 crc kubenswrapper[4687]: I0228 09:23:10.666543 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 28 09:23:10 crc kubenswrapper[4687]: I0228 09:23:10.669345 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 28 09:23:10 crc kubenswrapper[4687]: I0228 09:23:10.673723 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 28 09:23:11 crc kubenswrapper[4687]: I0228 09:23:11.488005 4687 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 28 09:23:11 crc kubenswrapper[4687]: I0228 09:23:11.493958 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 28 09:23:17 crc kubenswrapper[4687]: I0228 09:23:17.563629 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 28 09:23:25 crc kubenswrapper[4687]: I0228 09:23:25.468946 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 28 09:23:26 crc kubenswrapper[4687]: I0228 09:23:26.307373 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 28 09:23:29 crc kubenswrapper[4687]: I0228 09:23:29.475919 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="541f5799-4b5e-4767-aca7-8c3738502a06" containerName="rabbitmq" containerID="cri-o://cc10b6b23a3eab63c3944f46eeb03c0ab55ae001902fe5a9f2a6bae319a6709d" gracePeriod=604796 Feb 28 09:23:30 crc kubenswrapper[4687]: I0228 09:23:30.588205 4687 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="541f5799-4b5e-4767-aca7-8c3738502a06" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Feb 28 09:23:30 crc kubenswrapper[4687]: I0228 09:23:30.728572 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="171eb8fe-deaf-4936-b51d-de02b4131b8b" containerName="rabbitmq" containerID="cri-o://96a5955dcccd771e543c70d22a60fc61d48e862846d9534debc8d49a460704c1" gracePeriod=604796 Feb 28 09:23:35 crc kubenswrapper[4687]: I0228 09:23:35.739148 4687 generic.go:334] "Generic (PLEG): container finished" podID="541f5799-4b5e-4767-aca7-8c3738502a06" containerID="cc10b6b23a3eab63c3944f46eeb03c0ab55ae001902fe5a9f2a6bae319a6709d" 
exitCode=0 Feb 28 09:23:35 crc kubenswrapper[4687]: I0228 09:23:35.739236 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"541f5799-4b5e-4767-aca7-8c3738502a06","Type":"ContainerDied","Data":"cc10b6b23a3eab63c3944f46eeb03c0ab55ae001902fe5a9f2a6bae319a6709d"} Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.042098 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.135838 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/541f5799-4b5e-4767-aca7-8c3738502a06-rabbitmq-tls\") pod \"541f5799-4b5e-4767-aca7-8c3738502a06\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.136331 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/541f5799-4b5e-4767-aca7-8c3738502a06-config-data\") pod \"541f5799-4b5e-4767-aca7-8c3738502a06\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.136432 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jp667\" (UniqueName: \"kubernetes.io/projected/541f5799-4b5e-4767-aca7-8c3738502a06-kube-api-access-jp667\") pod \"541f5799-4b5e-4767-aca7-8c3738502a06\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.136589 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/541f5799-4b5e-4767-aca7-8c3738502a06-pod-info\") pod \"541f5799-4b5e-4767-aca7-8c3738502a06\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.136727 4687 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/541f5799-4b5e-4767-aca7-8c3738502a06-server-conf\") pod \"541f5799-4b5e-4767-aca7-8c3738502a06\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.136808 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/541f5799-4b5e-4767-aca7-8c3738502a06-rabbitmq-confd\") pod \"541f5799-4b5e-4767-aca7-8c3738502a06\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.136893 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"541f5799-4b5e-4767-aca7-8c3738502a06\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.136995 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/541f5799-4b5e-4767-aca7-8c3738502a06-plugins-conf\") pod \"541f5799-4b5e-4767-aca7-8c3738502a06\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.137120 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/541f5799-4b5e-4767-aca7-8c3738502a06-rabbitmq-erlang-cookie\") pod \"541f5799-4b5e-4767-aca7-8c3738502a06\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.137219 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/541f5799-4b5e-4767-aca7-8c3738502a06-erlang-cookie-secret\") pod \"541f5799-4b5e-4767-aca7-8c3738502a06\" (UID: 
\"541f5799-4b5e-4767-aca7-8c3738502a06\") " Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.137284 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/541f5799-4b5e-4767-aca7-8c3738502a06-rabbitmq-plugins\") pod \"541f5799-4b5e-4767-aca7-8c3738502a06\" (UID: \"541f5799-4b5e-4767-aca7-8c3738502a06\") " Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.138587 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/541f5799-4b5e-4767-aca7-8c3738502a06-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "541f5799-4b5e-4767-aca7-8c3738502a06" (UID: "541f5799-4b5e-4767-aca7-8c3738502a06"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.138788 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/541f5799-4b5e-4767-aca7-8c3738502a06-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "541f5799-4b5e-4767-aca7-8c3738502a06" (UID: "541f5799-4b5e-4767-aca7-8c3738502a06"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.138993 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/541f5799-4b5e-4767-aca7-8c3738502a06-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "541f5799-4b5e-4767-aca7-8c3738502a06" (UID: "541f5799-4b5e-4767-aca7-8c3738502a06"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.148105 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/541f5799-4b5e-4767-aca7-8c3738502a06-pod-info" (OuterVolumeSpecName: "pod-info") pod "541f5799-4b5e-4767-aca7-8c3738502a06" (UID: "541f5799-4b5e-4767-aca7-8c3738502a06"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.148106 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/541f5799-4b5e-4767-aca7-8c3738502a06-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "541f5799-4b5e-4767-aca7-8c3738502a06" (UID: "541f5799-4b5e-4767-aca7-8c3738502a06"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.148266 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/541f5799-4b5e-4767-aca7-8c3738502a06-kube-api-access-jp667" (OuterVolumeSpecName: "kube-api-access-jp667") pod "541f5799-4b5e-4767-aca7-8c3738502a06" (UID: "541f5799-4b5e-4767-aca7-8c3738502a06"). InnerVolumeSpecName "kube-api-access-jp667". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.149172 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/541f5799-4b5e-4767-aca7-8c3738502a06-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "541f5799-4b5e-4767-aca7-8c3738502a06" (UID: "541f5799-4b5e-4767-aca7-8c3738502a06"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.149323 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "541f5799-4b5e-4767-aca7-8c3738502a06" (UID: "541f5799-4b5e-4767-aca7-8c3738502a06"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.171784 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/541f5799-4b5e-4767-aca7-8c3738502a06-config-data" (OuterVolumeSpecName: "config-data") pod "541f5799-4b5e-4767-aca7-8c3738502a06" (UID: "541f5799-4b5e-4767-aca7-8c3738502a06"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.208454 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/541f5799-4b5e-4767-aca7-8c3738502a06-server-conf" (OuterVolumeSpecName: "server-conf") pod "541f5799-4b5e-4767-aca7-8c3738502a06" (UID: "541f5799-4b5e-4767-aca7-8c3738502a06"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.234529 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bfb45b47-9kzwv"] Feb 28 09:23:36 crc kubenswrapper[4687]: E0228 09:23:36.234956 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ba6f0a-8cc9-41a9-9444-5e338bd8a300" containerName="init" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.234974 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ba6f0a-8cc9-41a9-9444-5e338bd8a300" containerName="init" Feb 28 09:23:36 crc kubenswrapper[4687]: E0228 09:23:36.234997 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="541f5799-4b5e-4767-aca7-8c3738502a06" containerName="rabbitmq" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.235002 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="541f5799-4b5e-4767-aca7-8c3738502a06" containerName="rabbitmq" Feb 28 09:23:36 crc kubenswrapper[4687]: E0228 09:23:36.235013 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ba6f0a-8cc9-41a9-9444-5e338bd8a300" containerName="dnsmasq-dns" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.235034 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ba6f0a-8cc9-41a9-9444-5e338bd8a300" containerName="dnsmasq-dns" Feb 28 09:23:36 crc kubenswrapper[4687]: E0228 09:23:36.235046 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="541f5799-4b5e-4767-aca7-8c3738502a06" containerName="setup-container" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.235052 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="541f5799-4b5e-4767-aca7-8c3738502a06" containerName="setup-container" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.235247 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="541f5799-4b5e-4767-aca7-8c3738502a06" containerName="rabbitmq" Feb 28 09:23:36 crc kubenswrapper[4687]: 
I0228 09:23:36.235268 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="56ba6f0a-8cc9-41a9-9444-5e338bd8a300" containerName="dnsmasq-dns" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.236231 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.238638 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.241049 4687 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/541f5799-4b5e-4767-aca7-8c3738502a06-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.241078 4687 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/541f5799-4b5e-4767-aca7-8c3738502a06-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.241100 4687 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/541f5799-4b5e-4767-aca7-8c3738502a06-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.241110 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/541f5799-4b5e-4767-aca7-8c3738502a06-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.241122 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jp667\" (UniqueName: \"kubernetes.io/projected/541f5799-4b5e-4767-aca7-8c3738502a06-kube-api-access-jp667\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.241130 4687 reconciler_common.go:293] "Volume detached for volume \"pod-info\" 
(UniqueName: \"kubernetes.io/downward-api/541f5799-4b5e-4767-aca7-8c3738502a06-pod-info\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.241140 4687 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/541f5799-4b5e-4767-aca7-8c3738502a06-server-conf\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.241180 4687 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.241189 4687 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/541f5799-4b5e-4767-aca7-8c3738502a06-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.241199 4687 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/541f5799-4b5e-4767-aca7-8c3738502a06-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.246692 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bfb45b47-9kzwv"] Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.268199 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/541f5799-4b5e-4767-aca7-8c3738502a06-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "541f5799-4b5e-4767-aca7-8c3738502a06" (UID: "541f5799-4b5e-4767-aca7-8c3738502a06"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.276719 4687 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.342469 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-dns-svc\") pod \"dnsmasq-dns-bfb45b47-9kzwv\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.342511 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-openstack-edpm-ipam\") pod \"dnsmasq-dns-bfb45b47-9kzwv\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.342697 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-dns-swift-storage-0\") pod \"dnsmasq-dns-bfb45b47-9kzwv\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.342855 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-ovsdbserver-nb\") pod \"dnsmasq-dns-bfb45b47-9kzwv\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.342957 4687 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-config\") pod \"dnsmasq-dns-bfb45b47-9kzwv\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.343155 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7k7j\" (UniqueName: \"kubernetes.io/projected/7a8a36df-cd00-4365-80d3-7fae56073093-kube-api-access-m7k7j\") pod \"dnsmasq-dns-bfb45b47-9kzwv\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.343183 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-ovsdbserver-sb\") pod \"dnsmasq-dns-bfb45b47-9kzwv\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.343453 4687 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/541f5799-4b5e-4767-aca7-8c3738502a06-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.343471 4687 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.445006 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-openstack-edpm-ipam\") pod \"dnsmasq-dns-bfb45b47-9kzwv\" (UID: 
\"7a8a36df-cd00-4365-80d3-7fae56073093\") " pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.445107 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-dns-swift-storage-0\") pod \"dnsmasq-dns-bfb45b47-9kzwv\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.445188 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-ovsdbserver-nb\") pod \"dnsmasq-dns-bfb45b47-9kzwv\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.445287 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-config\") pod \"dnsmasq-dns-bfb45b47-9kzwv\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.445476 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7k7j\" (UniqueName: \"kubernetes.io/projected/7a8a36df-cd00-4365-80d3-7fae56073093-kube-api-access-m7k7j\") pod \"dnsmasq-dns-bfb45b47-9kzwv\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.445505 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-ovsdbserver-sb\") pod \"dnsmasq-dns-bfb45b47-9kzwv\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " 
pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.446053 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-openstack-edpm-ipam\") pod \"dnsmasq-dns-bfb45b47-9kzwv\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.446114 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-dns-swift-storage-0\") pod \"dnsmasq-dns-bfb45b47-9kzwv\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.446229 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-config\") pod \"dnsmasq-dns-bfb45b47-9kzwv\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.446251 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-ovsdbserver-nb\") pod \"dnsmasq-dns-bfb45b47-9kzwv\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.446575 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-dns-svc\") pod \"dnsmasq-dns-bfb45b47-9kzwv\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.446780 4687 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-ovsdbserver-sb\") pod \"dnsmasq-dns-bfb45b47-9kzwv\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.447257 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-dns-svc\") pod \"dnsmasq-dns-bfb45b47-9kzwv\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.459186 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7k7j\" (UniqueName: \"kubernetes.io/projected/7a8a36df-cd00-4365-80d3-7fae56073093-kube-api-access-m7k7j\") pod \"dnsmasq-dns-bfb45b47-9kzwv\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.557683 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.751929 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"541f5799-4b5e-4767-aca7-8c3738502a06","Type":"ContainerDied","Data":"206e2422ee3b551de75917d879a6617d4a05b1f456afc649e089e1537fea3d4c"} Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.752007 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.752273 4687 scope.go:117] "RemoveContainer" containerID="cc10b6b23a3eab63c3944f46eeb03c0ab55ae001902fe5a9f2a6bae319a6709d" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.786772 4687 scope.go:117] "RemoveContainer" containerID="fc6036d26129118d267b8cba85e86a89e8d8f4544e9f0c7c8c7911aa86fdebc9" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.793606 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.803003 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.814449 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.816319 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.820627 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.829564 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.829750 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-khs6z" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.841646 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.841818 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.841946 4687 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"rabbitmq-erlang-cookie" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.842113 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.842242 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.855221 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0af13829-a7ca-4952-8e73-2923cc70ef98-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.855256 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.855281 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8cgw\" (UniqueName: \"kubernetes.io/projected/0af13829-a7ca-4952-8e73-2923cc70ef98-kube-api-access-t8cgw\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.855330 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0af13829-a7ca-4952-8e73-2923cc70ef98-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 
09:23:36.855366 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0af13829-a7ca-4952-8e73-2923cc70ef98-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.855383 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0af13829-a7ca-4952-8e73-2923cc70ef98-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.855409 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0af13829-a7ca-4952-8e73-2923cc70ef98-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.855430 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0af13829-a7ca-4952-8e73-2923cc70ef98-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.855453 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0af13829-a7ca-4952-8e73-2923cc70ef98-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.855489 4687 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0af13829-a7ca-4952-8e73-2923cc70ef98-config-data\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.855517 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0af13829-a7ca-4952-8e73-2923cc70ef98-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.957956 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0af13829-a7ca-4952-8e73-2923cc70ef98-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.957996 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.958042 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8cgw\" (UniqueName: \"kubernetes.io/projected/0af13829-a7ca-4952-8e73-2923cc70ef98-kube-api-access-t8cgw\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.958100 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/0af13829-a7ca-4952-8e73-2923cc70ef98-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.958131 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0af13829-a7ca-4952-8e73-2923cc70ef98-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.958147 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0af13829-a7ca-4952-8e73-2923cc70ef98-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.958171 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0af13829-a7ca-4952-8e73-2923cc70ef98-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.958196 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0af13829-a7ca-4952-8e73-2923cc70ef98-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.958222 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0af13829-a7ca-4952-8e73-2923cc70ef98-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " 
pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.958238 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0af13829-a7ca-4952-8e73-2923cc70ef98-config-data\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.958263 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0af13829-a7ca-4952-8e73-2923cc70ef98-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.958611 4687 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.959492 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0af13829-a7ca-4952-8e73-2923cc70ef98-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.959777 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0af13829-a7ca-4952-8e73-2923cc70ef98-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.960013 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/0af13829-a7ca-4952-8e73-2923cc70ef98-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.960707 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0af13829-a7ca-4952-8e73-2923cc70ef98-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.961387 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0af13829-a7ca-4952-8e73-2923cc70ef98-config-data\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.968612 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0af13829-a7ca-4952-8e73-2923cc70ef98-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.970481 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0af13829-a7ca-4952-8e73-2923cc70ef98-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.971005 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0af13829-a7ca-4952-8e73-2923cc70ef98-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " 
pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.974314 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0af13829-a7ca-4952-8e73-2923cc70ef98-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:36 crc kubenswrapper[4687]: I0228 09:23:36.981215 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8cgw\" (UniqueName: \"kubernetes.io/projected/0af13829-a7ca-4952-8e73-2923cc70ef98-kube-api-access-t8cgw\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.004583 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"0af13829-a7ca-4952-8e73-2923cc70ef98\") " pod="openstack/rabbitmq-server-0" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.030928 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bfb45b47-9kzwv"] Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.230259 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.306311 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.473653 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/171eb8fe-deaf-4936-b51d-de02b4131b8b-rabbitmq-confd\") pod \"171eb8fe-deaf-4936-b51d-de02b4131b8b\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.473722 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/171eb8fe-deaf-4936-b51d-de02b4131b8b-erlang-cookie-secret\") pod \"171eb8fe-deaf-4936-b51d-de02b4131b8b\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.473761 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/171eb8fe-deaf-4936-b51d-de02b4131b8b-rabbitmq-tls\") pod \"171eb8fe-deaf-4936-b51d-de02b4131b8b\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.473784 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"171eb8fe-deaf-4936-b51d-de02b4131b8b\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.473824 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/171eb8fe-deaf-4936-b51d-de02b4131b8b-plugins-conf\") pod \"171eb8fe-deaf-4936-b51d-de02b4131b8b\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.473856 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/171eb8fe-deaf-4936-b51d-de02b4131b8b-pod-info\") pod \"171eb8fe-deaf-4936-b51d-de02b4131b8b\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.473887 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/171eb8fe-deaf-4936-b51d-de02b4131b8b-config-data\") pod \"171eb8fe-deaf-4936-b51d-de02b4131b8b\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.474043 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv7rs\" (UniqueName: \"kubernetes.io/projected/171eb8fe-deaf-4936-b51d-de02b4131b8b-kube-api-access-xv7rs\") pod \"171eb8fe-deaf-4936-b51d-de02b4131b8b\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.474183 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/171eb8fe-deaf-4936-b51d-de02b4131b8b-rabbitmq-erlang-cookie\") pod \"171eb8fe-deaf-4936-b51d-de02b4131b8b\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.474259 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/171eb8fe-deaf-4936-b51d-de02b4131b8b-server-conf\") pod \"171eb8fe-deaf-4936-b51d-de02b4131b8b\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.474289 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/171eb8fe-deaf-4936-b51d-de02b4131b8b-rabbitmq-plugins\") pod \"171eb8fe-deaf-4936-b51d-de02b4131b8b\" (UID: \"171eb8fe-deaf-4936-b51d-de02b4131b8b\") " Feb 28 09:23:37 crc 
kubenswrapper[4687]: I0228 09:23:37.475720 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/171eb8fe-deaf-4936-b51d-de02b4131b8b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "171eb8fe-deaf-4936-b51d-de02b4131b8b" (UID: "171eb8fe-deaf-4936-b51d-de02b4131b8b"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.475837 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/171eb8fe-deaf-4936-b51d-de02b4131b8b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "171eb8fe-deaf-4936-b51d-de02b4131b8b" (UID: "171eb8fe-deaf-4936-b51d-de02b4131b8b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.476214 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/171eb8fe-deaf-4936-b51d-de02b4131b8b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "171eb8fe-deaf-4936-b51d-de02b4131b8b" (UID: "171eb8fe-deaf-4936-b51d-de02b4131b8b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.488917 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/171eb8fe-deaf-4936-b51d-de02b4131b8b-kube-api-access-xv7rs" (OuterVolumeSpecName: "kube-api-access-xv7rs") pod "171eb8fe-deaf-4936-b51d-de02b4131b8b" (UID: "171eb8fe-deaf-4936-b51d-de02b4131b8b"). InnerVolumeSpecName "kube-api-access-xv7rs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.496168 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "171eb8fe-deaf-4936-b51d-de02b4131b8b" (UID: "171eb8fe-deaf-4936-b51d-de02b4131b8b"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.509188 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/171eb8fe-deaf-4936-b51d-de02b4131b8b-pod-info" (OuterVolumeSpecName: "pod-info") pod "171eb8fe-deaf-4936-b51d-de02b4131b8b" (UID: "171eb8fe-deaf-4936-b51d-de02b4131b8b"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.509346 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/171eb8fe-deaf-4936-b51d-de02b4131b8b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "171eb8fe-deaf-4936-b51d-de02b4131b8b" (UID: "171eb8fe-deaf-4936-b51d-de02b4131b8b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.516119 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/171eb8fe-deaf-4936-b51d-de02b4131b8b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "171eb8fe-deaf-4936-b51d-de02b4131b8b" (UID: "171eb8fe-deaf-4936-b51d-de02b4131b8b"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.549899 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/171eb8fe-deaf-4936-b51d-de02b4131b8b-config-data" (OuterVolumeSpecName: "config-data") pod "171eb8fe-deaf-4936-b51d-de02b4131b8b" (UID: "171eb8fe-deaf-4936-b51d-de02b4131b8b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.580467 4687 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/171eb8fe-deaf-4936-b51d-de02b4131b8b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.580508 4687 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/171eb8fe-deaf-4936-b51d-de02b4131b8b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.580518 4687 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/171eb8fe-deaf-4936-b51d-de02b4131b8b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.580527 4687 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/171eb8fe-deaf-4936-b51d-de02b4131b8b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.580559 4687 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.580572 4687 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/171eb8fe-deaf-4936-b51d-de02b4131b8b-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.580581 4687 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/171eb8fe-deaf-4936-b51d-de02b4131b8b-pod-info\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.580589 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/171eb8fe-deaf-4936-b51d-de02b4131b8b-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.580597 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv7rs\" (UniqueName: \"kubernetes.io/projected/171eb8fe-deaf-4936-b51d-de02b4131b8b-kube-api-access-xv7rs\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.587369 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/171eb8fe-deaf-4936-b51d-de02b4131b8b-server-conf" (OuterVolumeSpecName: "server-conf") pod "171eb8fe-deaf-4936-b51d-de02b4131b8b" (UID: "171eb8fe-deaf-4936-b51d-de02b4131b8b"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.610602 4687 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.689763 4687 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.689795 4687 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/171eb8fe-deaf-4936-b51d-de02b4131b8b-server-conf\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.702245 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/171eb8fe-deaf-4936-b51d-de02b4131b8b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "171eb8fe-deaf-4936-b51d-de02b4131b8b" (UID: "171eb8fe-deaf-4936-b51d-de02b4131b8b"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.761606 4687 generic.go:334] "Generic (PLEG): container finished" podID="171eb8fe-deaf-4936-b51d-de02b4131b8b" containerID="96a5955dcccd771e543c70d22a60fc61d48e862846d9534debc8d49a460704c1" exitCode=0 Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.761671 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"171eb8fe-deaf-4936-b51d-de02b4131b8b","Type":"ContainerDied","Data":"96a5955dcccd771e543c70d22a60fc61d48e862846d9534debc8d49a460704c1"} Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.761704 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"171eb8fe-deaf-4936-b51d-de02b4131b8b","Type":"ContainerDied","Data":"c835050cda7388df5c0329a89bc25e1c3f3497740cfe7d7c1128fb951745ab22"} Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.761724 4687 scope.go:117] "RemoveContainer" containerID="96a5955dcccd771e543c70d22a60fc61d48e862846d9534debc8d49a460704c1" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.761868 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.776428 4687 generic.go:334] "Generic (PLEG): container finished" podID="7a8a36df-cd00-4365-80d3-7fae56073093" containerID="3c2bb466347ee5007e34edf0194f9224104169393429688efa76b54548af7c3d" exitCode=0 Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.776460 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" event={"ID":"7a8a36df-cd00-4365-80d3-7fae56073093","Type":"ContainerDied","Data":"3c2bb466347ee5007e34edf0194f9224104169393429688efa76b54548af7c3d"} Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.776477 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" event={"ID":"7a8a36df-cd00-4365-80d3-7fae56073093","Type":"ContainerStarted","Data":"7f6c672f7e91450d43acb8ffef8b6d2a56d8ed0f9d025651cf14902498abf714"} Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.790460 4687 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/171eb8fe-deaf-4936-b51d-de02b4131b8b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.819582 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.900888 4687 scope.go:117] "RemoveContainer" containerID="6c7e5035e6c7381269e50141c66991933c97603d3e9469d3c92f79c4e27e4068" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.931764 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.948118 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.967510 4687 scope.go:117] "RemoveContainer" 
containerID="96a5955dcccd771e543c70d22a60fc61d48e862846d9534debc8d49a460704c1" Feb 28 09:23:37 crc kubenswrapper[4687]: E0228 09:23:37.969014 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96a5955dcccd771e543c70d22a60fc61d48e862846d9534debc8d49a460704c1\": container with ID starting with 96a5955dcccd771e543c70d22a60fc61d48e862846d9534debc8d49a460704c1 not found: ID does not exist" containerID="96a5955dcccd771e543c70d22a60fc61d48e862846d9534debc8d49a460704c1" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.969066 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96a5955dcccd771e543c70d22a60fc61d48e862846d9534debc8d49a460704c1"} err="failed to get container status \"96a5955dcccd771e543c70d22a60fc61d48e862846d9534debc8d49a460704c1\": rpc error: code = NotFound desc = could not find container \"96a5955dcccd771e543c70d22a60fc61d48e862846d9534debc8d49a460704c1\": container with ID starting with 96a5955dcccd771e543c70d22a60fc61d48e862846d9534debc8d49a460704c1 not found: ID does not exist" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.969101 4687 scope.go:117] "RemoveContainer" containerID="6c7e5035e6c7381269e50141c66991933c97603d3e9469d3c92f79c4e27e4068" Feb 28 09:23:37 crc kubenswrapper[4687]: E0228 09:23:37.969569 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c7e5035e6c7381269e50141c66991933c97603d3e9469d3c92f79c4e27e4068\": container with ID starting with 6c7e5035e6c7381269e50141c66991933c97603d3e9469d3c92f79c4e27e4068 not found: ID does not exist" containerID="6c7e5035e6c7381269e50141c66991933c97603d3e9469d3c92f79c4e27e4068" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.969596 4687 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6c7e5035e6c7381269e50141c66991933c97603d3e9469d3c92f79c4e27e4068"} err="failed to get container status \"6c7e5035e6c7381269e50141c66991933c97603d3e9469d3c92f79c4e27e4068\": rpc error: code = NotFound desc = could not find container \"6c7e5035e6c7381269e50141c66991933c97603d3e9469d3c92f79c4e27e4068\": container with ID starting with 6c7e5035e6c7381269e50141c66991933c97603d3e9469d3c92f79c4e27e4068 not found: ID does not exist" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.980795 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 28 09:23:37 crc kubenswrapper[4687]: E0228 09:23:37.981358 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="171eb8fe-deaf-4936-b51d-de02b4131b8b" containerName="rabbitmq" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.981376 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="171eb8fe-deaf-4936-b51d-de02b4131b8b" containerName="rabbitmq" Feb 28 09:23:37 crc kubenswrapper[4687]: E0228 09:23:37.981394 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="171eb8fe-deaf-4936-b51d-de02b4131b8b" containerName="setup-container" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.981402 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="171eb8fe-deaf-4936-b51d-de02b4131b8b" containerName="setup-container" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.981590 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="171eb8fe-deaf-4936-b51d-de02b4131b8b" containerName="rabbitmq" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.982594 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.985008 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.985519 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.985651 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.985761 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lv99p" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.986062 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.986103 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.986388 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 28 09:23:37 crc kubenswrapper[4687]: I0228 09:23:37.996900 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.098256 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.098324 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.098369 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.098572 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.098714 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.098769 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.098851 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.098964 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.099057 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.099121 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stnbs\" (UniqueName: \"kubernetes.io/projected/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-kube-api-access-stnbs\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.099299 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.201636 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.201705 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.201739 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.201771 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.201811 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.201845 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.201873 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stnbs\" (UniqueName: \"kubernetes.io/projected/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-kube-api-access-stnbs\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.201934 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.201963 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.201994 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.202047 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 
09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.202654 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.203050 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.203266 4687 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.203267 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.203410 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.203432 4687 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.207812 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.208710 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.209170 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.209716 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.219471 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stnbs\" (UniqueName: \"kubernetes.io/projected/02945b48-0d0e-4c7c-8247-7b3060a6fc3c-kube-api-access-stnbs\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.232983 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"02945b48-0d0e-4c7c-8247-7b3060a6fc3c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.306783 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.668233 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="171eb8fe-deaf-4936-b51d-de02b4131b8b" path="/var/lib/kubelet/pods/171eb8fe-deaf-4936-b51d-de02b4131b8b/volumes" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.669467 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="541f5799-4b5e-4767-aca7-8c3738502a06" path="/var/lib/kubelet/pods/541f5799-4b5e-4767-aca7-8c3738502a06/volumes" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.735129 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.785951 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"02945b48-0d0e-4c7c-8247-7b3060a6fc3c","Type":"ContainerStarted","Data":"f1e6e8b74758799b261f3d6741e806a6dc79ab91aaad4de2a150ef526e4483ab"} Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.789125 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0af13829-a7ca-4952-8e73-2923cc70ef98","Type":"ContainerStarted","Data":"856fb44d849b6e59217563363467cdcf8a10c2132d9d789d3fa960bd8160fb61"} Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.791368 4687 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" event={"ID":"7a8a36df-cd00-4365-80d3-7fae56073093","Type":"ContainerStarted","Data":"db2889ec6fd14f9f8ace84d681615867d4a9f4c582aabe8684e05e4c7f30b0c6"} Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.791491 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:38 crc kubenswrapper[4687]: I0228 09:23:38.815469 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" podStartSLOduration=2.815458425 podStartE2EDuration="2.815458425s" podCreationTimestamp="2026-02-28 09:23:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:23:38.806373552 +0000 UTC m=+1210.496942899" watchObservedRunningTime="2026-02-28 09:23:38.815458425 +0000 UTC m=+1210.506027762" Feb 28 09:23:39 crc kubenswrapper[4687]: I0228 09:23:39.803558 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0af13829-a7ca-4952-8e73-2923cc70ef98","Type":"ContainerStarted","Data":"7dbd9bdb6bc265fceae38b3fb901460bab95692269bb079bfeaf825300ac9b1d"} Feb 28 09:23:40 crc kubenswrapper[4687]: I0228 09:23:40.816151 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"02945b48-0d0e-4c7c-8247-7b3060a6fc3c","Type":"ContainerStarted","Data":"6f4b19fcc014f00cba3d9101d605a555767265534c32d943b20966c26e2cdcb9"} Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.560145 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.613821 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-67mrm"] Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 
09:23:46.614070 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7749c44969-67mrm" podUID="d393de87-edb5-4ebd-986b-2857110b1706" containerName="dnsmasq-dns" containerID="cri-o://7bc95be36b18ee7280ad5ee6d217184930449b81e6794a3ec32cf140c198b50e" gracePeriod=10 Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.730862 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79fcc958f9-9dbr2"] Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.734418 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.760398 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79fcc958f9-9dbr2"] Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.804042 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8f2c1ae-1407-4d58-86af-05f1f1311d1a-ovsdbserver-sb\") pod \"dnsmasq-dns-79fcc958f9-9dbr2\" (UID: \"a8f2c1ae-1407-4d58-86af-05f1f1311d1a\") " pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.804322 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jj4q\" (UniqueName: \"kubernetes.io/projected/a8f2c1ae-1407-4d58-86af-05f1f1311d1a-kube-api-access-6jj4q\") pod \"dnsmasq-dns-79fcc958f9-9dbr2\" (UID: \"a8f2c1ae-1407-4d58-86af-05f1f1311d1a\") " pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.804430 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a8f2c1ae-1407-4d58-86af-05f1f1311d1a-dns-swift-storage-0\") pod \"dnsmasq-dns-79fcc958f9-9dbr2\" (UID: 
\"a8f2c1ae-1407-4d58-86af-05f1f1311d1a\") " pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.804661 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8f2c1ae-1407-4d58-86af-05f1f1311d1a-config\") pod \"dnsmasq-dns-79fcc958f9-9dbr2\" (UID: \"a8f2c1ae-1407-4d58-86af-05f1f1311d1a\") " pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.804781 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8f2c1ae-1407-4d58-86af-05f1f1311d1a-dns-svc\") pod \"dnsmasq-dns-79fcc958f9-9dbr2\" (UID: \"a8f2c1ae-1407-4d58-86af-05f1f1311d1a\") " pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.804867 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8f2c1ae-1407-4d58-86af-05f1f1311d1a-ovsdbserver-nb\") pod \"dnsmasq-dns-79fcc958f9-9dbr2\" (UID: \"a8f2c1ae-1407-4d58-86af-05f1f1311d1a\") " pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.805054 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a8f2c1ae-1407-4d58-86af-05f1f1311d1a-openstack-edpm-ipam\") pod \"dnsmasq-dns-79fcc958f9-9dbr2\" (UID: \"a8f2c1ae-1407-4d58-86af-05f1f1311d1a\") " pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.874414 4687 generic.go:334] "Generic (PLEG): container finished" podID="d393de87-edb5-4ebd-986b-2857110b1706" containerID="7bc95be36b18ee7280ad5ee6d217184930449b81e6794a3ec32cf140c198b50e" exitCode=0 Feb 28 09:23:46 crc kubenswrapper[4687]: 
I0228 09:23:46.874564 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-67mrm" event={"ID":"d393de87-edb5-4ebd-986b-2857110b1706","Type":"ContainerDied","Data":"7bc95be36b18ee7280ad5ee6d217184930449b81e6794a3ec32cf140c198b50e"} Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.911283 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jj4q\" (UniqueName: \"kubernetes.io/projected/a8f2c1ae-1407-4d58-86af-05f1f1311d1a-kube-api-access-6jj4q\") pod \"dnsmasq-dns-79fcc958f9-9dbr2\" (UID: \"a8f2c1ae-1407-4d58-86af-05f1f1311d1a\") " pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.911330 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a8f2c1ae-1407-4d58-86af-05f1f1311d1a-dns-swift-storage-0\") pod \"dnsmasq-dns-79fcc958f9-9dbr2\" (UID: \"a8f2c1ae-1407-4d58-86af-05f1f1311d1a\") " pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.911409 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8f2c1ae-1407-4d58-86af-05f1f1311d1a-config\") pod \"dnsmasq-dns-79fcc958f9-9dbr2\" (UID: \"a8f2c1ae-1407-4d58-86af-05f1f1311d1a\") " pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.911451 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8f2c1ae-1407-4d58-86af-05f1f1311d1a-dns-svc\") pod \"dnsmasq-dns-79fcc958f9-9dbr2\" (UID: \"a8f2c1ae-1407-4d58-86af-05f1f1311d1a\") " pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.911481 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/a8f2c1ae-1407-4d58-86af-05f1f1311d1a-ovsdbserver-nb\") pod \"dnsmasq-dns-79fcc958f9-9dbr2\" (UID: \"a8f2c1ae-1407-4d58-86af-05f1f1311d1a\") " pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.911557 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a8f2c1ae-1407-4d58-86af-05f1f1311d1a-openstack-edpm-ipam\") pod \"dnsmasq-dns-79fcc958f9-9dbr2\" (UID: \"a8f2c1ae-1407-4d58-86af-05f1f1311d1a\") " pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.911695 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8f2c1ae-1407-4d58-86af-05f1f1311d1a-ovsdbserver-sb\") pod \"dnsmasq-dns-79fcc958f9-9dbr2\" (UID: \"a8f2c1ae-1407-4d58-86af-05f1f1311d1a\") " pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.913000 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8f2c1ae-1407-4d58-86af-05f1f1311d1a-ovsdbserver-sb\") pod \"dnsmasq-dns-79fcc958f9-9dbr2\" (UID: \"a8f2c1ae-1407-4d58-86af-05f1f1311d1a\") " pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.913038 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8f2c1ae-1407-4d58-86af-05f1f1311d1a-dns-svc\") pod \"dnsmasq-dns-79fcc958f9-9dbr2\" (UID: \"a8f2c1ae-1407-4d58-86af-05f1f1311d1a\") " pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.913395 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8f2c1ae-1407-4d58-86af-05f1f1311d1a-ovsdbserver-nb\") 
pod \"dnsmasq-dns-79fcc958f9-9dbr2\" (UID: \"a8f2c1ae-1407-4d58-86af-05f1f1311d1a\") " pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.913642 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a8f2c1ae-1407-4d58-86af-05f1f1311d1a-dns-swift-storage-0\") pod \"dnsmasq-dns-79fcc958f9-9dbr2\" (UID: \"a8f2c1ae-1407-4d58-86af-05f1f1311d1a\") " pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.913737 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8f2c1ae-1407-4d58-86af-05f1f1311d1a-config\") pod \"dnsmasq-dns-79fcc958f9-9dbr2\" (UID: \"a8f2c1ae-1407-4d58-86af-05f1f1311d1a\") " pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.913997 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a8f2c1ae-1407-4d58-86af-05f1f1311d1a-openstack-edpm-ipam\") pod \"dnsmasq-dns-79fcc958f9-9dbr2\" (UID: \"a8f2c1ae-1407-4d58-86af-05f1f1311d1a\") " pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" Feb 28 09:23:46 crc kubenswrapper[4687]: I0228 09:23:46.933740 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jj4q\" (UniqueName: \"kubernetes.io/projected/a8f2c1ae-1407-4d58-86af-05f1f1311d1a-kube-api-access-6jj4q\") pod \"dnsmasq-dns-79fcc958f9-9dbr2\" (UID: \"a8f2c1ae-1407-4d58-86af-05f1f1311d1a\") " pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 09:23:47.098045 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-67mrm" Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 09:23:47.107035 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 09:23:47.224159 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-dns-swift-storage-0\") pod \"d393de87-edb5-4ebd-986b-2857110b1706\" (UID: \"d393de87-edb5-4ebd-986b-2857110b1706\") " Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 09:23:47.224528 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-ovsdbserver-nb\") pod \"d393de87-edb5-4ebd-986b-2857110b1706\" (UID: \"d393de87-edb5-4ebd-986b-2857110b1706\") " Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 09:23:47.224721 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-config\") pod \"d393de87-edb5-4ebd-986b-2857110b1706\" (UID: \"d393de87-edb5-4ebd-986b-2857110b1706\") " Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 09:23:47.224927 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjqkj\" (UniqueName: \"kubernetes.io/projected/d393de87-edb5-4ebd-986b-2857110b1706-kube-api-access-xjqkj\") pod \"d393de87-edb5-4ebd-986b-2857110b1706\" (UID: \"d393de87-edb5-4ebd-986b-2857110b1706\") " Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 09:23:47.224996 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-dns-svc\") pod \"d393de87-edb5-4ebd-986b-2857110b1706\" (UID: \"d393de87-edb5-4ebd-986b-2857110b1706\") " Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 09:23:47.225194 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-ovsdbserver-sb\") pod \"d393de87-edb5-4ebd-986b-2857110b1706\" (UID: \"d393de87-edb5-4ebd-986b-2857110b1706\") " Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 09:23:47.232725 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d393de87-edb5-4ebd-986b-2857110b1706-kube-api-access-xjqkj" (OuterVolumeSpecName: "kube-api-access-xjqkj") pod "d393de87-edb5-4ebd-986b-2857110b1706" (UID: "d393de87-edb5-4ebd-986b-2857110b1706"). InnerVolumeSpecName "kube-api-access-xjqkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 09:23:47.275058 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d393de87-edb5-4ebd-986b-2857110b1706" (UID: "d393de87-edb5-4ebd-986b-2857110b1706"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 09:23:47.281576 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-config" (OuterVolumeSpecName: "config") pod "d393de87-edb5-4ebd-986b-2857110b1706" (UID: "d393de87-edb5-4ebd-986b-2857110b1706"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 09:23:47.295607 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d393de87-edb5-4ebd-986b-2857110b1706" (UID: "d393de87-edb5-4ebd-986b-2857110b1706"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 09:23:47.296216 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d393de87-edb5-4ebd-986b-2857110b1706" (UID: "d393de87-edb5-4ebd-986b-2857110b1706"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 09:23:47.299408 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d393de87-edb5-4ebd-986b-2857110b1706" (UID: "d393de87-edb5-4ebd-986b-2857110b1706"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 09:23:47.329042 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 09:23:47.329068 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjqkj\" (UniqueName: \"kubernetes.io/projected/d393de87-edb5-4ebd-986b-2857110b1706-kube-api-access-xjqkj\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 09:23:47.329090 4687 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 09:23:47.329102 4687 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:47 crc 
kubenswrapper[4687]: I0228 09:23:47.329114 4687 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 09:23:47.329124 4687 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d393de87-edb5-4ebd-986b-2857110b1706-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 09:23:47.557423 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79fcc958f9-9dbr2"] Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 09:23:47.890790 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-67mrm" Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 09:23:47.890747 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-67mrm" event={"ID":"d393de87-edb5-4ebd-986b-2857110b1706","Type":"ContainerDied","Data":"c4797f5ebd943aef8bb63af5b32b7b9f9b651aaa2e9f37ebebd76f92975ebc80"} Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 09:23:47.891338 4687 scope.go:117] "RemoveContainer" containerID="7bc95be36b18ee7280ad5ee6d217184930449b81e6794a3ec32cf140c198b50e" Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 09:23:47.892614 4687 generic.go:334] "Generic (PLEG): container finished" podID="a8f2c1ae-1407-4d58-86af-05f1f1311d1a" containerID="9abbe221a7654beb8530fc83853aa47ab0c1c7d7e65c8be6a134fdfcdd2a4602" exitCode=0 Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 09:23:47.892659 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" event={"ID":"a8f2c1ae-1407-4d58-86af-05f1f1311d1a","Type":"ContainerDied","Data":"9abbe221a7654beb8530fc83853aa47ab0c1c7d7e65c8be6a134fdfcdd2a4602"} Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 
09:23:47.892689 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" event={"ID":"a8f2c1ae-1407-4d58-86af-05f1f1311d1a","Type":"ContainerStarted","Data":"cf3d754b6ae12afa35ead9f2b85fc970156ec40859d7b3a7df2bb8c03cd497e9"} Feb 28 09:23:47 crc kubenswrapper[4687]: I0228 09:23:47.914807 4687 scope.go:117] "RemoveContainer" containerID="62974ac7a3bb4f4fbf6aacafa370e1439a62488eb145b09d0e3ac042be13443d" Feb 28 09:23:48 crc kubenswrapper[4687]: I0228 09:23:48.100901 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-67mrm"] Feb 28 09:23:48 crc kubenswrapper[4687]: I0228 09:23:48.106761 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-67mrm"] Feb 28 09:23:48 crc kubenswrapper[4687]: I0228 09:23:48.665344 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d393de87-edb5-4ebd-986b-2857110b1706" path="/var/lib/kubelet/pods/d393de87-edb5-4ebd-986b-2857110b1706/volumes" Feb 28 09:23:48 crc kubenswrapper[4687]: I0228 09:23:48.907623 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" event={"ID":"a8f2c1ae-1407-4d58-86af-05f1f1311d1a","Type":"ContainerStarted","Data":"28cc7d744958ea9faeae1271897b79a120fbc82fbb7120c7522da5b13eccc40f"} Feb 28 09:23:48 crc kubenswrapper[4687]: I0228 09:23:48.907766 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" Feb 28 09:23:48 crc kubenswrapper[4687]: I0228 09:23:48.940358 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" podStartSLOduration=2.940343741 podStartE2EDuration="2.940343741s" podCreationTimestamp="2026-02-28 09:23:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:23:48.935722471 +0000 UTC 
m=+1220.626291808" watchObservedRunningTime="2026-02-28 09:23:48.940343741 +0000 UTC m=+1220.630913077" Feb 28 09:23:57 crc kubenswrapper[4687]: I0228 09:23:57.109432 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79fcc958f9-9dbr2" Feb 28 09:23:57 crc kubenswrapper[4687]: I0228 09:23:57.162556 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bfb45b47-9kzwv"] Feb 28 09:23:57 crc kubenswrapper[4687]: I0228 09:23:57.164627 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" podUID="7a8a36df-cd00-4365-80d3-7fae56073093" containerName="dnsmasq-dns" containerID="cri-o://db2889ec6fd14f9f8ace84d681615867d4a9f4c582aabe8684e05e4c7f30b0c6" gracePeriod=10 Feb 28 09:23:57 crc kubenswrapper[4687]: I0228 09:23:57.588655 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:57 crc kubenswrapper[4687]: I0228 09:23:57.646192 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-ovsdbserver-sb\") pod \"7a8a36df-cd00-4365-80d3-7fae56073093\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " Feb 28 09:23:57 crc kubenswrapper[4687]: I0228 09:23:57.646257 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-dns-swift-storage-0\") pod \"7a8a36df-cd00-4365-80d3-7fae56073093\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " Feb 28 09:23:57 crc kubenswrapper[4687]: I0228 09:23:57.646562 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7k7j\" (UniqueName: \"kubernetes.io/projected/7a8a36df-cd00-4365-80d3-7fae56073093-kube-api-access-m7k7j\") pod 
\"7a8a36df-cd00-4365-80d3-7fae56073093\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " Feb 28 09:23:57 crc kubenswrapper[4687]: I0228 09:23:57.646594 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-ovsdbserver-nb\") pod \"7a8a36df-cd00-4365-80d3-7fae56073093\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " Feb 28 09:23:57 crc kubenswrapper[4687]: I0228 09:23:57.653355 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a8a36df-cd00-4365-80d3-7fae56073093-kube-api-access-m7k7j" (OuterVolumeSpecName: "kube-api-access-m7k7j") pod "7a8a36df-cd00-4365-80d3-7fae56073093" (UID: "7a8a36df-cd00-4365-80d3-7fae56073093"). InnerVolumeSpecName "kube-api-access-m7k7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:23:57 crc kubenswrapper[4687]: I0228 09:23:57.695343 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7a8a36df-cd00-4365-80d3-7fae56073093" (UID: "7a8a36df-cd00-4365-80d3-7fae56073093"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:23:57 crc kubenswrapper[4687]: I0228 09:23:57.700379 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7a8a36df-cd00-4365-80d3-7fae56073093" (UID: "7a8a36df-cd00-4365-80d3-7fae56073093"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:23:57 crc kubenswrapper[4687]: I0228 09:23:57.703376 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7a8a36df-cd00-4365-80d3-7fae56073093" (UID: "7a8a36df-cd00-4365-80d3-7fae56073093"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:23:57 crc kubenswrapper[4687]: I0228 09:23:57.749057 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-dns-svc\") pod \"7a8a36df-cd00-4365-80d3-7fae56073093\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " Feb 28 09:23:57 crc kubenswrapper[4687]: I0228 09:23:57.749140 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-config\") pod \"7a8a36df-cd00-4365-80d3-7fae56073093\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " Feb 28 09:23:57 crc kubenswrapper[4687]: I0228 09:23:57.749300 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-openstack-edpm-ipam\") pod \"7a8a36df-cd00-4365-80d3-7fae56073093\" (UID: \"7a8a36df-cd00-4365-80d3-7fae56073093\") " Feb 28 09:23:57 crc kubenswrapper[4687]: I0228 09:23:57.749830 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7k7j\" (UniqueName: \"kubernetes.io/projected/7a8a36df-cd00-4365-80d3-7fae56073093-kube-api-access-m7k7j\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:57 crc kubenswrapper[4687]: I0228 09:23:57.749849 4687 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:57 crc kubenswrapper[4687]: I0228 09:23:57.749857 4687 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:57 crc kubenswrapper[4687]: I0228 09:23:57.749870 4687 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:57 crc kubenswrapper[4687]: I0228 09:23:57.780103 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "7a8a36df-cd00-4365-80d3-7fae56073093" (UID: "7a8a36df-cd00-4365-80d3-7fae56073093"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:23:57 crc kubenswrapper[4687]: I0228 09:23:57.785503 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7a8a36df-cd00-4365-80d3-7fae56073093" (UID: "7a8a36df-cd00-4365-80d3-7fae56073093"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:23:57 crc kubenswrapper[4687]: I0228 09:23:57.789117 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-config" (OuterVolumeSpecName: "config") pod "7a8a36df-cd00-4365-80d3-7fae56073093" (UID: "7a8a36df-cd00-4365-80d3-7fae56073093"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:23:57 crc kubenswrapper[4687]: I0228 09:23:57.852115 4687 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:57 crc kubenswrapper[4687]: I0228 09:23:57.852157 4687 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:57 crc kubenswrapper[4687]: I0228 09:23:57.852169 4687 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a8a36df-cd00-4365-80d3-7fae56073093-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:23:58 crc kubenswrapper[4687]: I0228 09:23:58.003115 4687 generic.go:334] "Generic (PLEG): container finished" podID="7a8a36df-cd00-4365-80d3-7fae56073093" containerID="db2889ec6fd14f9f8ace84d681615867d4a9f4c582aabe8684e05e4c7f30b0c6" exitCode=0 Feb 28 09:23:58 crc kubenswrapper[4687]: I0228 09:23:58.003373 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" event={"ID":"7a8a36df-cd00-4365-80d3-7fae56073093","Type":"ContainerDied","Data":"db2889ec6fd14f9f8ace84d681615867d4a9f4c582aabe8684e05e4c7f30b0c6"} Feb 28 09:23:58 crc kubenswrapper[4687]: I0228 09:23:58.003654 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" event={"ID":"7a8a36df-cd00-4365-80d3-7fae56073093","Type":"ContainerDied","Data":"7f6c672f7e91450d43acb8ffef8b6d2a56d8ed0f9d025651cf14902498abf714"} Feb 28 09:23:58 crc kubenswrapper[4687]: I0228 09:23:58.003691 4687 scope.go:117] "RemoveContainer" containerID="db2889ec6fd14f9f8ace84d681615867d4a9f4c582aabe8684e05e4c7f30b0c6" Feb 28 09:23:58 crc kubenswrapper[4687]: I0228 09:23:58.003525 4687 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bfb45b47-9kzwv" Feb 28 09:23:58 crc kubenswrapper[4687]: I0228 09:23:58.035095 4687 scope.go:117] "RemoveContainer" containerID="3c2bb466347ee5007e34edf0194f9224104169393429688efa76b54548af7c3d" Feb 28 09:23:58 crc kubenswrapper[4687]: I0228 09:23:58.042829 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bfb45b47-9kzwv"] Feb 28 09:23:58 crc kubenswrapper[4687]: I0228 09:23:58.048523 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bfb45b47-9kzwv"] Feb 28 09:23:58 crc kubenswrapper[4687]: I0228 09:23:58.058965 4687 scope.go:117] "RemoveContainer" containerID="db2889ec6fd14f9f8ace84d681615867d4a9f4c582aabe8684e05e4c7f30b0c6" Feb 28 09:23:58 crc kubenswrapper[4687]: E0228 09:23:58.059483 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db2889ec6fd14f9f8ace84d681615867d4a9f4c582aabe8684e05e4c7f30b0c6\": container with ID starting with db2889ec6fd14f9f8ace84d681615867d4a9f4c582aabe8684e05e4c7f30b0c6 not found: ID does not exist" containerID="db2889ec6fd14f9f8ace84d681615867d4a9f4c582aabe8684e05e4c7f30b0c6" Feb 28 09:23:58 crc kubenswrapper[4687]: I0228 09:23:58.059524 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db2889ec6fd14f9f8ace84d681615867d4a9f4c582aabe8684e05e4c7f30b0c6"} err="failed to get container status \"db2889ec6fd14f9f8ace84d681615867d4a9f4c582aabe8684e05e4c7f30b0c6\": rpc error: code = NotFound desc = could not find container \"db2889ec6fd14f9f8ace84d681615867d4a9f4c582aabe8684e05e4c7f30b0c6\": container with ID starting with db2889ec6fd14f9f8ace84d681615867d4a9f4c582aabe8684e05e4c7f30b0c6 not found: ID does not exist" Feb 28 09:23:58 crc kubenswrapper[4687]: I0228 09:23:58.059552 4687 scope.go:117] "RemoveContainer" 
containerID="3c2bb466347ee5007e34edf0194f9224104169393429688efa76b54548af7c3d" Feb 28 09:23:58 crc kubenswrapper[4687]: E0228 09:23:58.059854 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c2bb466347ee5007e34edf0194f9224104169393429688efa76b54548af7c3d\": container with ID starting with 3c2bb466347ee5007e34edf0194f9224104169393429688efa76b54548af7c3d not found: ID does not exist" containerID="3c2bb466347ee5007e34edf0194f9224104169393429688efa76b54548af7c3d" Feb 28 09:23:58 crc kubenswrapper[4687]: I0228 09:23:58.059882 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c2bb466347ee5007e34edf0194f9224104169393429688efa76b54548af7c3d"} err="failed to get container status \"3c2bb466347ee5007e34edf0194f9224104169393429688efa76b54548af7c3d\": rpc error: code = NotFound desc = could not find container \"3c2bb466347ee5007e34edf0194f9224104169393429688efa76b54548af7c3d\": container with ID starting with 3c2bb466347ee5007e34edf0194f9224104169393429688efa76b54548af7c3d not found: ID does not exist" Feb 28 09:23:58 crc kubenswrapper[4687]: I0228 09:23:58.671593 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a8a36df-cd00-4365-80d3-7fae56073093" path="/var/lib/kubelet/pods/7a8a36df-cd00-4365-80d3-7fae56073093/volumes" Feb 28 09:24:00 crc kubenswrapper[4687]: I0228 09:24:00.144218 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537844-nhrcz"] Feb 28 09:24:00 crc kubenswrapper[4687]: E0228 09:24:00.144878 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d393de87-edb5-4ebd-986b-2857110b1706" containerName="init" Feb 28 09:24:00 crc kubenswrapper[4687]: I0228 09:24:00.144892 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="d393de87-edb5-4ebd-986b-2857110b1706" containerName="init" Feb 28 09:24:00 crc kubenswrapper[4687]: E0228 09:24:00.144904 
4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a8a36df-cd00-4365-80d3-7fae56073093" containerName="init" Feb 28 09:24:00 crc kubenswrapper[4687]: I0228 09:24:00.144910 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a8a36df-cd00-4365-80d3-7fae56073093" containerName="init" Feb 28 09:24:00 crc kubenswrapper[4687]: E0228 09:24:00.144920 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d393de87-edb5-4ebd-986b-2857110b1706" containerName="dnsmasq-dns" Feb 28 09:24:00 crc kubenswrapper[4687]: I0228 09:24:00.144926 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="d393de87-edb5-4ebd-986b-2857110b1706" containerName="dnsmasq-dns" Feb 28 09:24:00 crc kubenswrapper[4687]: E0228 09:24:00.144941 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a8a36df-cd00-4365-80d3-7fae56073093" containerName="dnsmasq-dns" Feb 28 09:24:00 crc kubenswrapper[4687]: I0228 09:24:00.144946 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a8a36df-cd00-4365-80d3-7fae56073093" containerName="dnsmasq-dns" Feb 28 09:24:00 crc kubenswrapper[4687]: I0228 09:24:00.145137 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="d393de87-edb5-4ebd-986b-2857110b1706" containerName="dnsmasq-dns" Feb 28 09:24:00 crc kubenswrapper[4687]: I0228 09:24:00.145149 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a8a36df-cd00-4365-80d3-7fae56073093" containerName="dnsmasq-dns" Feb 28 09:24:00 crc kubenswrapper[4687]: I0228 09:24:00.145716 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537844-nhrcz" Feb 28 09:24:00 crc kubenswrapper[4687]: I0228 09:24:00.147668 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:24:00 crc kubenswrapper[4687]: I0228 09:24:00.147905 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:24:00 crc kubenswrapper[4687]: I0228 09:24:00.149880 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:24:00 crc kubenswrapper[4687]: I0228 09:24:00.163155 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537844-nhrcz"] Feb 28 09:24:00 crc kubenswrapper[4687]: I0228 09:24:00.301871 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5k7v\" (UniqueName: \"kubernetes.io/projected/ab0c3dcc-aa8f-41f1-8014-05bf76455d2a-kube-api-access-h5k7v\") pod \"auto-csr-approver-29537844-nhrcz\" (UID: \"ab0c3dcc-aa8f-41f1-8014-05bf76455d2a\") " pod="openshift-infra/auto-csr-approver-29537844-nhrcz" Feb 28 09:24:00 crc kubenswrapper[4687]: I0228 09:24:00.403911 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5k7v\" (UniqueName: \"kubernetes.io/projected/ab0c3dcc-aa8f-41f1-8014-05bf76455d2a-kube-api-access-h5k7v\") pod \"auto-csr-approver-29537844-nhrcz\" (UID: \"ab0c3dcc-aa8f-41f1-8014-05bf76455d2a\") " pod="openshift-infra/auto-csr-approver-29537844-nhrcz" Feb 28 09:24:00 crc kubenswrapper[4687]: I0228 09:24:00.425230 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5k7v\" (UniqueName: \"kubernetes.io/projected/ab0c3dcc-aa8f-41f1-8014-05bf76455d2a-kube-api-access-h5k7v\") pod \"auto-csr-approver-29537844-nhrcz\" (UID: \"ab0c3dcc-aa8f-41f1-8014-05bf76455d2a\") " 
pod="openshift-infra/auto-csr-approver-29537844-nhrcz" Feb 28 09:24:00 crc kubenswrapper[4687]: I0228 09:24:00.463234 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537844-nhrcz" Feb 28 09:24:00 crc kubenswrapper[4687]: I0228 09:24:00.881546 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537844-nhrcz"] Feb 28 09:24:00 crc kubenswrapper[4687]: W0228 09:24:00.889273 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab0c3dcc_aa8f_41f1_8014_05bf76455d2a.slice/crio-d14c7aba3892ad18bc1d3bf41ff2974b0ad37f15872ad3b9cf5a2c9276bec7f2 WatchSource:0}: Error finding container d14c7aba3892ad18bc1d3bf41ff2974b0ad37f15872ad3b9cf5a2c9276bec7f2: Status 404 returned error can't find the container with id d14c7aba3892ad18bc1d3bf41ff2974b0ad37f15872ad3b9cf5a2c9276bec7f2 Feb 28 09:24:01 crc kubenswrapper[4687]: I0228 09:24:01.036499 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537844-nhrcz" event={"ID":"ab0c3dcc-aa8f-41f1-8014-05bf76455d2a","Type":"ContainerStarted","Data":"d14c7aba3892ad18bc1d3bf41ff2974b0ad37f15872ad3b9cf5a2c9276bec7f2"} Feb 28 09:24:02 crc kubenswrapper[4687]: I0228 09:24:02.057318 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537844-nhrcz" event={"ID":"ab0c3dcc-aa8f-41f1-8014-05bf76455d2a","Type":"ContainerStarted","Data":"009568f75339d1a8c3c9123c3946d8b90dfafcad46f65b2258c839ba6da203dd"} Feb 28 09:24:02 crc kubenswrapper[4687]: I0228 09:24:02.079993 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29537844-nhrcz" podStartSLOduration=1.238308748 podStartE2EDuration="2.079972528s" podCreationTimestamp="2026-02-28 09:24:00 +0000 UTC" firstStartedPulling="2026-02-28 09:24:00.893796476 +0000 UTC 
m=+1232.584365814" lastFinishedPulling="2026-02-28 09:24:01.735460257 +0000 UTC m=+1233.426029594" observedRunningTime="2026-02-28 09:24:02.072560058 +0000 UTC m=+1233.763129396" watchObservedRunningTime="2026-02-28 09:24:02.079972528 +0000 UTC m=+1233.770541864" Feb 28 09:24:03 crc kubenswrapper[4687]: I0228 09:24:03.073562 4687 generic.go:334] "Generic (PLEG): container finished" podID="ab0c3dcc-aa8f-41f1-8014-05bf76455d2a" containerID="009568f75339d1a8c3c9123c3946d8b90dfafcad46f65b2258c839ba6da203dd" exitCode=0 Feb 28 09:24:03 crc kubenswrapper[4687]: I0228 09:24:03.073681 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537844-nhrcz" event={"ID":"ab0c3dcc-aa8f-41f1-8014-05bf76455d2a","Type":"ContainerDied","Data":"009568f75339d1a8c3c9123c3946d8b90dfafcad46f65b2258c839ba6da203dd"} Feb 28 09:24:04 crc kubenswrapper[4687]: I0228 09:24:04.371219 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537844-nhrcz" Feb 28 09:24:04 crc kubenswrapper[4687]: I0228 09:24:04.499160 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5k7v\" (UniqueName: \"kubernetes.io/projected/ab0c3dcc-aa8f-41f1-8014-05bf76455d2a-kube-api-access-h5k7v\") pod \"ab0c3dcc-aa8f-41f1-8014-05bf76455d2a\" (UID: \"ab0c3dcc-aa8f-41f1-8014-05bf76455d2a\") " Feb 28 09:24:04 crc kubenswrapper[4687]: I0228 09:24:04.507733 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab0c3dcc-aa8f-41f1-8014-05bf76455d2a-kube-api-access-h5k7v" (OuterVolumeSpecName: "kube-api-access-h5k7v") pod "ab0c3dcc-aa8f-41f1-8014-05bf76455d2a" (UID: "ab0c3dcc-aa8f-41f1-8014-05bf76455d2a"). InnerVolumeSpecName "kube-api-access-h5k7v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:24:04 crc kubenswrapper[4687]: I0228 09:24:04.603593 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5k7v\" (UniqueName: \"kubernetes.io/projected/ab0c3dcc-aa8f-41f1-8014-05bf76455d2a-kube-api-access-h5k7v\") on node \"crc\" DevicePath \"\"" Feb 28 09:24:05 crc kubenswrapper[4687]: I0228 09:24:05.095970 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537844-nhrcz" event={"ID":"ab0c3dcc-aa8f-41f1-8014-05bf76455d2a","Type":"ContainerDied","Data":"d14c7aba3892ad18bc1d3bf41ff2974b0ad37f15872ad3b9cf5a2c9276bec7f2"} Feb 28 09:24:05 crc kubenswrapper[4687]: I0228 09:24:05.096470 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d14c7aba3892ad18bc1d3bf41ff2974b0ad37f15872ad3b9cf5a2c9276bec7f2" Feb 28 09:24:05 crc kubenswrapper[4687]: I0228 09:24:05.096076 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537844-nhrcz" Feb 28 09:24:05 crc kubenswrapper[4687]: I0228 09:24:05.135283 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537838-tpskf"] Feb 28 09:24:05 crc kubenswrapper[4687]: I0228 09:24:05.144598 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537838-tpskf"] Feb 28 09:24:06 crc kubenswrapper[4687]: I0228 09:24:06.667147 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="431f48d9-5c93-4c3e-b2e4-bdb74b8945e3" path="/var/lib/kubelet/pods/431f48d9-5c93-4c3e-b2e4-bdb74b8945e3/volumes" Feb 28 09:24:10 crc kubenswrapper[4687]: I0228 09:24:10.144316 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd"] Feb 28 09:24:10 crc kubenswrapper[4687]: E0228 09:24:10.145084 4687 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ab0c3dcc-aa8f-41f1-8014-05bf76455d2a" containerName="oc" Feb 28 09:24:10 crc kubenswrapper[4687]: I0228 09:24:10.145107 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab0c3dcc-aa8f-41f1-8014-05bf76455d2a" containerName="oc" Feb 28 09:24:10 crc kubenswrapper[4687]: I0228 09:24:10.145302 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab0c3dcc-aa8f-41f1-8014-05bf76455d2a" containerName="oc" Feb 28 09:24:10 crc kubenswrapper[4687]: I0228 09:24:10.145981 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd" Feb 28 09:24:10 crc kubenswrapper[4687]: I0228 09:24:10.147805 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 28 09:24:10 crc kubenswrapper[4687]: I0228 09:24:10.151796 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 28 09:24:10 crc kubenswrapper[4687]: I0228 09:24:10.151958 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 28 09:24:10 crc kubenswrapper[4687]: I0228 09:24:10.155497 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ffgb4" Feb 28 09:24:10 crc kubenswrapper[4687]: I0228 09:24:10.164403 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd"] Feb 28 09:24:10 crc kubenswrapper[4687]: I0228 09:24:10.327905 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb39766e-6294-4141-be47-7a7085460449-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd\" (UID: \"bb39766e-6294-4141-be47-7a7085460449\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd" Feb 28 
09:24:10 crc kubenswrapper[4687]: I0228 09:24:10.328577 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m2sr\" (UniqueName: \"kubernetes.io/projected/bb39766e-6294-4141-be47-7a7085460449-kube-api-access-5m2sr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd\" (UID: \"bb39766e-6294-4141-be47-7a7085460449\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd" Feb 28 09:24:10 crc kubenswrapper[4687]: I0228 09:24:10.328788 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb39766e-6294-4141-be47-7a7085460449-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd\" (UID: \"bb39766e-6294-4141-be47-7a7085460449\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd" Feb 28 09:24:10 crc kubenswrapper[4687]: I0228 09:24:10.328827 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bb39766e-6294-4141-be47-7a7085460449-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd\" (UID: \"bb39766e-6294-4141-be47-7a7085460449\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd" Feb 28 09:24:10 crc kubenswrapper[4687]: I0228 09:24:10.431671 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m2sr\" (UniqueName: \"kubernetes.io/projected/bb39766e-6294-4141-be47-7a7085460449-kube-api-access-5m2sr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd\" (UID: \"bb39766e-6294-4141-be47-7a7085460449\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd" Feb 28 09:24:10 crc kubenswrapper[4687]: I0228 09:24:10.431825 4687 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb39766e-6294-4141-be47-7a7085460449-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd\" (UID: \"bb39766e-6294-4141-be47-7a7085460449\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd" Feb 28 09:24:10 crc kubenswrapper[4687]: I0228 09:24:10.431860 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bb39766e-6294-4141-be47-7a7085460449-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd\" (UID: \"bb39766e-6294-4141-be47-7a7085460449\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd" Feb 28 09:24:10 crc kubenswrapper[4687]: I0228 09:24:10.431946 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb39766e-6294-4141-be47-7a7085460449-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd\" (UID: \"bb39766e-6294-4141-be47-7a7085460449\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd" Feb 28 09:24:10 crc kubenswrapper[4687]: I0228 09:24:10.439363 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bb39766e-6294-4141-be47-7a7085460449-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd\" (UID: \"bb39766e-6294-4141-be47-7a7085460449\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd" Feb 28 09:24:10 crc kubenswrapper[4687]: I0228 09:24:10.439955 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb39766e-6294-4141-be47-7a7085460449-inventory\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd\" (UID: \"bb39766e-6294-4141-be47-7a7085460449\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd" Feb 28 09:24:10 crc kubenswrapper[4687]: I0228 09:24:10.440068 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb39766e-6294-4141-be47-7a7085460449-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd\" (UID: \"bb39766e-6294-4141-be47-7a7085460449\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd" Feb 28 09:24:10 crc kubenswrapper[4687]: I0228 09:24:10.445799 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m2sr\" (UniqueName: \"kubernetes.io/projected/bb39766e-6294-4141-be47-7a7085460449-kube-api-access-5m2sr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd\" (UID: \"bb39766e-6294-4141-be47-7a7085460449\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd" Feb 28 09:24:10 crc kubenswrapper[4687]: I0228 09:24:10.462610 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd" Feb 28 09:24:10 crc kubenswrapper[4687]: I0228 09:24:10.982117 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd"] Feb 28 09:24:10 crc kubenswrapper[4687]: W0228 09:24:10.984439 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb39766e_6294_4141_be47_7a7085460449.slice/crio-a125cb894061ec84db71eb7e18e4ea6dc4809e1afd3c97ab39bd729e70eafaeb WatchSource:0}: Error finding container a125cb894061ec84db71eb7e18e4ea6dc4809e1afd3c97ab39bd729e70eafaeb: Status 404 returned error can't find the container with id a125cb894061ec84db71eb7e18e4ea6dc4809e1afd3c97ab39bd729e70eafaeb Feb 28 09:24:11 crc kubenswrapper[4687]: I0228 09:24:11.168189 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd" event={"ID":"bb39766e-6294-4141-be47-7a7085460449","Type":"ContainerStarted","Data":"a125cb894061ec84db71eb7e18e4ea6dc4809e1afd3c97ab39bd729e70eafaeb"} Feb 28 09:24:12 crc kubenswrapper[4687]: I0228 09:24:12.205350 4687 generic.go:334] "Generic (PLEG): container finished" podID="02945b48-0d0e-4c7c-8247-7b3060a6fc3c" containerID="6f4b19fcc014f00cba3d9101d605a555767265534c32d943b20966c26e2cdcb9" exitCode=0 Feb 28 09:24:12 crc kubenswrapper[4687]: I0228 09:24:12.205433 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"02945b48-0d0e-4c7c-8247-7b3060a6fc3c","Type":"ContainerDied","Data":"6f4b19fcc014f00cba3d9101d605a555767265534c32d943b20966c26e2cdcb9"} Feb 28 09:24:12 crc kubenswrapper[4687]: I0228 09:24:12.228201 4687 generic.go:334] "Generic (PLEG): container finished" podID="0af13829-a7ca-4952-8e73-2923cc70ef98" containerID="7dbd9bdb6bc265fceae38b3fb901460bab95692269bb079bfeaf825300ac9b1d" exitCode=0 Feb 28 09:24:12 
crc kubenswrapper[4687]: I0228 09:24:12.228244 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0af13829-a7ca-4952-8e73-2923cc70ef98","Type":"ContainerDied","Data":"7dbd9bdb6bc265fceae38b3fb901460bab95692269bb079bfeaf825300ac9b1d"} Feb 28 09:24:13 crc kubenswrapper[4687]: I0228 09:24:13.241632 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"02945b48-0d0e-4c7c-8247-7b3060a6fc3c","Type":"ContainerStarted","Data":"7ae8d93d9e54f35235c6c5d181c74a0e7c0138f3c42d67e3e074d2c3c1e1782d"} Feb 28 09:24:13 crc kubenswrapper[4687]: I0228 09:24:13.242379 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:24:13 crc kubenswrapper[4687]: I0228 09:24:13.254910 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0af13829-a7ca-4952-8e73-2923cc70ef98","Type":"ContainerStarted","Data":"937de236786bb27a3bc0205946e49f3ce01d0bfa431cf7a26517da7114019484"} Feb 28 09:24:13 crc kubenswrapper[4687]: I0228 09:24:13.255120 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 28 09:24:13 crc kubenswrapper[4687]: I0228 09:24:13.270957 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.270936574 podStartE2EDuration="36.270936574s" podCreationTimestamp="2026-02-28 09:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:24:13.263156273 +0000 UTC m=+1244.953725620" watchObservedRunningTime="2026-02-28 09:24:13.270936574 +0000 UTC m=+1244.961505910" Feb 28 09:24:13 crc kubenswrapper[4687]: I0228 09:24:13.285192 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" 
podStartSLOduration=37.285167923 podStartE2EDuration="37.285167923s" podCreationTimestamp="2026-02-28 09:23:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:24:13.281057544 +0000 UTC m=+1244.971626881" watchObservedRunningTime="2026-02-28 09:24:13.285167923 +0000 UTC m=+1244.975737260" Feb 28 09:24:21 crc kubenswrapper[4687]: I0228 09:24:21.341878 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd" event={"ID":"bb39766e-6294-4141-be47-7a7085460449","Type":"ContainerStarted","Data":"f442aad35412233717978d7aa7a0ffdf3ae5177a3fd5ae366428d0823002b2e0"} Feb 28 09:24:21 crc kubenswrapper[4687]: I0228 09:24:21.361238 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd" podStartSLOduration=2.063947666 podStartE2EDuration="11.361218668s" podCreationTimestamp="2026-02-28 09:24:10 +0000 UTC" firstStartedPulling="2026-02-28 09:24:10.987056243 +0000 UTC m=+1242.677625580" lastFinishedPulling="2026-02-28 09:24:20.284327246 +0000 UTC m=+1251.974896582" observedRunningTime="2026-02-28 09:24:21.358133948 +0000 UTC m=+1253.048703284" watchObservedRunningTime="2026-02-28 09:24:21.361218668 +0000 UTC m=+1253.051788004" Feb 28 09:24:27 crc kubenswrapper[4687]: I0228 09:24:27.235262 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 28 09:24:28 crc kubenswrapper[4687]: I0228 09:24:28.310185 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 28 09:24:32 crc kubenswrapper[4687]: I0228 09:24:32.449273 4687 generic.go:334] "Generic (PLEG): container finished" podID="bb39766e-6294-4141-be47-7a7085460449" containerID="f442aad35412233717978d7aa7a0ffdf3ae5177a3fd5ae366428d0823002b2e0" exitCode=0 Feb 28 
09:24:32 crc kubenswrapper[4687]: I0228 09:24:32.449363 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd" event={"ID":"bb39766e-6294-4141-be47-7a7085460449","Type":"ContainerDied","Data":"f442aad35412233717978d7aa7a0ffdf3ae5177a3fd5ae366428d0823002b2e0"} Feb 28 09:24:33 crc kubenswrapper[4687]: I0228 09:24:33.800197 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd" Feb 28 09:24:33 crc kubenswrapper[4687]: I0228 09:24:33.816494 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb39766e-6294-4141-be47-7a7085460449-inventory\") pod \"bb39766e-6294-4141-be47-7a7085460449\" (UID: \"bb39766e-6294-4141-be47-7a7085460449\") " Feb 28 09:24:33 crc kubenswrapper[4687]: I0228 09:24:33.816764 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb39766e-6294-4141-be47-7a7085460449-repo-setup-combined-ca-bundle\") pod \"bb39766e-6294-4141-be47-7a7085460449\" (UID: \"bb39766e-6294-4141-be47-7a7085460449\") " Feb 28 09:24:33 crc kubenswrapper[4687]: I0228 09:24:33.816827 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bb39766e-6294-4141-be47-7a7085460449-ssh-key-openstack-edpm-ipam\") pod \"bb39766e-6294-4141-be47-7a7085460449\" (UID: \"bb39766e-6294-4141-be47-7a7085460449\") " Feb 28 09:24:33 crc kubenswrapper[4687]: I0228 09:24:33.816940 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m2sr\" (UniqueName: \"kubernetes.io/projected/bb39766e-6294-4141-be47-7a7085460449-kube-api-access-5m2sr\") pod \"bb39766e-6294-4141-be47-7a7085460449\" (UID: 
\"bb39766e-6294-4141-be47-7a7085460449\") " Feb 28 09:24:33 crc kubenswrapper[4687]: I0228 09:24:33.825273 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb39766e-6294-4141-be47-7a7085460449-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "bb39766e-6294-4141-be47-7a7085460449" (UID: "bb39766e-6294-4141-be47-7a7085460449"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:24:33 crc kubenswrapper[4687]: I0228 09:24:33.825707 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb39766e-6294-4141-be47-7a7085460449-kube-api-access-5m2sr" (OuterVolumeSpecName: "kube-api-access-5m2sr") pod "bb39766e-6294-4141-be47-7a7085460449" (UID: "bb39766e-6294-4141-be47-7a7085460449"). InnerVolumeSpecName "kube-api-access-5m2sr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:24:33 crc kubenswrapper[4687]: I0228 09:24:33.848448 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb39766e-6294-4141-be47-7a7085460449-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bb39766e-6294-4141-be47-7a7085460449" (UID: "bb39766e-6294-4141-be47-7a7085460449"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:24:33 crc kubenswrapper[4687]: I0228 09:24:33.854370 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb39766e-6294-4141-be47-7a7085460449-inventory" (OuterVolumeSpecName: "inventory") pod "bb39766e-6294-4141-be47-7a7085460449" (UID: "bb39766e-6294-4141-be47-7a7085460449"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:24:33 crc kubenswrapper[4687]: I0228 09:24:33.919734 4687 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb39766e-6294-4141-be47-7a7085460449-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:24:33 crc kubenswrapper[4687]: I0228 09:24:33.919771 4687 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bb39766e-6294-4141-be47-7a7085460449-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 28 09:24:33 crc kubenswrapper[4687]: I0228 09:24:33.919784 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m2sr\" (UniqueName: \"kubernetes.io/projected/bb39766e-6294-4141-be47-7a7085460449-kube-api-access-5m2sr\") on node \"crc\" DevicePath \"\"" Feb 28 09:24:33 crc kubenswrapper[4687]: I0228 09:24:33.919795 4687 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb39766e-6294-4141-be47-7a7085460449-inventory\") on node \"crc\" DevicePath \"\"" Feb 28 09:24:34 crc kubenswrapper[4687]: I0228 09:24:34.472929 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd" event={"ID":"bb39766e-6294-4141-be47-7a7085460449","Type":"ContainerDied","Data":"a125cb894061ec84db71eb7e18e4ea6dc4809e1afd3c97ab39bd729e70eafaeb"} Feb 28 09:24:34 crc kubenswrapper[4687]: I0228 09:24:34.472987 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a125cb894061ec84db71eb7e18e4ea6dc4809e1afd3c97ab39bd729e70eafaeb" Feb 28 09:24:34 crc kubenswrapper[4687]: I0228 09:24:34.473082 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd" Feb 28 09:24:34 crc kubenswrapper[4687]: I0228 09:24:34.594366 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-lg8ps"] Feb 28 09:24:34 crc kubenswrapper[4687]: E0228 09:24:34.595011 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb39766e-6294-4141-be47-7a7085460449" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 28 09:24:34 crc kubenswrapper[4687]: I0228 09:24:34.595049 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb39766e-6294-4141-be47-7a7085460449" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 28 09:24:34 crc kubenswrapper[4687]: I0228 09:24:34.595294 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb39766e-6294-4141-be47-7a7085460449" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 28 09:24:34 crc kubenswrapper[4687]: I0228 09:24:34.595950 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lg8ps" Feb 28 09:24:34 crc kubenswrapper[4687]: I0228 09:24:34.597850 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 28 09:24:34 crc kubenswrapper[4687]: I0228 09:24:34.598248 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 28 09:24:34 crc kubenswrapper[4687]: I0228 09:24:34.598456 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ffgb4" Feb 28 09:24:34 crc kubenswrapper[4687]: I0228 09:24:34.598759 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 28 09:24:34 crc kubenswrapper[4687]: I0228 09:24:34.610756 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-lg8ps"] Feb 28 09:24:34 crc kubenswrapper[4687]: I0228 09:24:34.740946 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a7981ec-8e60-4379-af52-5188e5b53dcf-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-lg8ps\" (UID: \"5a7981ec-8e60-4379-af52-5188e5b53dcf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lg8ps" Feb 28 09:24:34 crc kubenswrapper[4687]: I0228 09:24:34.741282 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a7981ec-8e60-4379-af52-5188e5b53dcf-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-lg8ps\" (UID: \"5a7981ec-8e60-4379-af52-5188e5b53dcf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lg8ps" Feb 28 09:24:34 crc kubenswrapper[4687]: I0228 09:24:34.741528 4687 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rktbv\" (UniqueName: \"kubernetes.io/projected/5a7981ec-8e60-4379-af52-5188e5b53dcf-kube-api-access-rktbv\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-lg8ps\" (UID: \"5a7981ec-8e60-4379-af52-5188e5b53dcf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lg8ps" Feb 28 09:24:34 crc kubenswrapper[4687]: I0228 09:24:34.844117 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a7981ec-8e60-4379-af52-5188e5b53dcf-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-lg8ps\" (UID: \"5a7981ec-8e60-4379-af52-5188e5b53dcf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lg8ps" Feb 28 09:24:34 crc kubenswrapper[4687]: I0228 09:24:34.844352 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rktbv\" (UniqueName: \"kubernetes.io/projected/5a7981ec-8e60-4379-af52-5188e5b53dcf-kube-api-access-rktbv\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-lg8ps\" (UID: \"5a7981ec-8e60-4379-af52-5188e5b53dcf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lg8ps" Feb 28 09:24:34 crc kubenswrapper[4687]: I0228 09:24:34.844464 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a7981ec-8e60-4379-af52-5188e5b53dcf-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-lg8ps\" (UID: \"5a7981ec-8e60-4379-af52-5188e5b53dcf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lg8ps" Feb 28 09:24:34 crc kubenswrapper[4687]: I0228 09:24:34.851017 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a7981ec-8e60-4379-af52-5188e5b53dcf-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-lg8ps\" (UID: \"5a7981ec-8e60-4379-af52-5188e5b53dcf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lg8ps" Feb 28 09:24:34 crc kubenswrapper[4687]: I0228 09:24:34.851269 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a7981ec-8e60-4379-af52-5188e5b53dcf-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-lg8ps\" (UID: \"5a7981ec-8e60-4379-af52-5188e5b53dcf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lg8ps" Feb 28 09:24:34 crc kubenswrapper[4687]: I0228 09:24:34.859682 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rktbv\" (UniqueName: \"kubernetes.io/projected/5a7981ec-8e60-4379-af52-5188e5b53dcf-kube-api-access-rktbv\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-lg8ps\" (UID: \"5a7981ec-8e60-4379-af52-5188e5b53dcf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lg8ps" Feb 28 09:24:34 crc kubenswrapper[4687]: I0228 09:24:34.912797 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lg8ps" Feb 28 09:24:35 crc kubenswrapper[4687]: I0228 09:24:35.367414 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-lg8ps"] Feb 28 09:24:35 crc kubenswrapper[4687]: I0228 09:24:35.484221 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lg8ps" event={"ID":"5a7981ec-8e60-4379-af52-5188e5b53dcf","Type":"ContainerStarted","Data":"8994ea6dbcde1dce7885b02026444439542300759d2e42bc223fd0ca3765fa04"} Feb 28 09:24:36 crc kubenswrapper[4687]: I0228 09:24:36.495247 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lg8ps" event={"ID":"5a7981ec-8e60-4379-af52-5188e5b53dcf","Type":"ContainerStarted","Data":"e0da12ebcdf513133cb34b099f793d57cb72ad564dde3fc5c16dcee52d552da4"} Feb 28 09:24:36 crc kubenswrapper[4687]: I0228 09:24:36.508748 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lg8ps" podStartSLOduration=2.034226196 podStartE2EDuration="2.508728597s" podCreationTimestamp="2026-02-28 09:24:34 +0000 UTC" firstStartedPulling="2026-02-28 09:24:35.371493304 +0000 UTC m=+1267.062062641" lastFinishedPulling="2026-02-28 09:24:35.845995706 +0000 UTC m=+1267.536565042" observedRunningTime="2026-02-28 09:24:36.507717455 +0000 UTC m=+1268.198286792" watchObservedRunningTime="2026-02-28 09:24:36.508728597 +0000 UTC m=+1268.199297934" Feb 28 09:24:38 crc kubenswrapper[4687]: I0228 09:24:38.514890 4687 generic.go:334] "Generic (PLEG): container finished" podID="5a7981ec-8e60-4379-af52-5188e5b53dcf" containerID="e0da12ebcdf513133cb34b099f793d57cb72ad564dde3fc5c16dcee52d552da4" exitCode=0 Feb 28 09:24:38 crc kubenswrapper[4687]: I0228 09:24:38.514987 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lg8ps" event={"ID":"5a7981ec-8e60-4379-af52-5188e5b53dcf","Type":"ContainerDied","Data":"e0da12ebcdf513133cb34b099f793d57cb72ad564dde3fc5c16dcee52d552da4"} Feb 28 09:24:39 crc kubenswrapper[4687]: I0228 09:24:39.819691 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lg8ps" Feb 28 09:24:39 crc kubenswrapper[4687]: I0228 09:24:39.954016 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a7981ec-8e60-4379-af52-5188e5b53dcf-inventory\") pod \"5a7981ec-8e60-4379-af52-5188e5b53dcf\" (UID: \"5a7981ec-8e60-4379-af52-5188e5b53dcf\") " Feb 28 09:24:39 crc kubenswrapper[4687]: I0228 09:24:39.954174 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rktbv\" (UniqueName: \"kubernetes.io/projected/5a7981ec-8e60-4379-af52-5188e5b53dcf-kube-api-access-rktbv\") pod \"5a7981ec-8e60-4379-af52-5188e5b53dcf\" (UID: \"5a7981ec-8e60-4379-af52-5188e5b53dcf\") " Feb 28 09:24:39 crc kubenswrapper[4687]: I0228 09:24:39.954227 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a7981ec-8e60-4379-af52-5188e5b53dcf-ssh-key-openstack-edpm-ipam\") pod \"5a7981ec-8e60-4379-af52-5188e5b53dcf\" (UID: \"5a7981ec-8e60-4379-af52-5188e5b53dcf\") " Feb 28 09:24:39 crc kubenswrapper[4687]: I0228 09:24:39.960741 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a7981ec-8e60-4379-af52-5188e5b53dcf-kube-api-access-rktbv" (OuterVolumeSpecName: "kube-api-access-rktbv") pod "5a7981ec-8e60-4379-af52-5188e5b53dcf" (UID: "5a7981ec-8e60-4379-af52-5188e5b53dcf"). InnerVolumeSpecName "kube-api-access-rktbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:24:39 crc kubenswrapper[4687]: I0228 09:24:39.978866 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a7981ec-8e60-4379-af52-5188e5b53dcf-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5a7981ec-8e60-4379-af52-5188e5b53dcf" (UID: "5a7981ec-8e60-4379-af52-5188e5b53dcf"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:24:39 crc kubenswrapper[4687]: I0228 09:24:39.980224 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a7981ec-8e60-4379-af52-5188e5b53dcf-inventory" (OuterVolumeSpecName: "inventory") pod "5a7981ec-8e60-4379-af52-5188e5b53dcf" (UID: "5a7981ec-8e60-4379-af52-5188e5b53dcf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.057014 4687 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a7981ec-8e60-4379-af52-5188e5b53dcf-inventory\") on node \"crc\" DevicePath \"\"" Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.057055 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rktbv\" (UniqueName: \"kubernetes.io/projected/5a7981ec-8e60-4379-af52-5188e5b53dcf-kube-api-access-rktbv\") on node \"crc\" DevicePath \"\"" Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.057069 4687 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a7981ec-8e60-4379-af52-5188e5b53dcf-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.534444 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lg8ps" 
event={"ID":"5a7981ec-8e60-4379-af52-5188e5b53dcf","Type":"ContainerDied","Data":"8994ea6dbcde1dce7885b02026444439542300759d2e42bc223fd0ca3765fa04"} Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.534493 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8994ea6dbcde1dce7885b02026444439542300759d2e42bc223fd0ca3765fa04" Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.534514 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lg8ps" Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.586385 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls"] Feb 28 09:24:40 crc kubenswrapper[4687]: E0228 09:24:40.587131 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a7981ec-8e60-4379-af52-5188e5b53dcf" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.587155 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a7981ec-8e60-4379-af52-5188e5b53dcf" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.587377 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a7981ec-8e60-4379-af52-5188e5b53dcf" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.588071 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls" Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.589885 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.590259 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.590445 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ffgb4" Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.590586 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.597193 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls"] Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.669833 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7dln\" (UniqueName: \"kubernetes.io/projected/e607377f-9f4c-4f40-8d5c-17487eb054b8-kube-api-access-p7dln\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls\" (UID: \"e607377f-9f4c-4f40-8d5c-17487eb054b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls" Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.669888 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e607377f-9f4c-4f40-8d5c-17487eb054b8-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls\" (UID: \"e607377f-9f4c-4f40-8d5c-17487eb054b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls" Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 
09:24:40.669949 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e607377f-9f4c-4f40-8d5c-17487eb054b8-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls\" (UID: \"e607377f-9f4c-4f40-8d5c-17487eb054b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls" Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.669975 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e607377f-9f4c-4f40-8d5c-17487eb054b8-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls\" (UID: \"e607377f-9f4c-4f40-8d5c-17487eb054b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls" Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.771122 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7dln\" (UniqueName: \"kubernetes.io/projected/e607377f-9f4c-4f40-8d5c-17487eb054b8-kube-api-access-p7dln\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls\" (UID: \"e607377f-9f4c-4f40-8d5c-17487eb054b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls" Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.771190 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e607377f-9f4c-4f40-8d5c-17487eb054b8-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls\" (UID: \"e607377f-9f4c-4f40-8d5c-17487eb054b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls" Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.771230 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/e607377f-9f4c-4f40-8d5c-17487eb054b8-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls\" (UID: \"e607377f-9f4c-4f40-8d5c-17487eb054b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls" Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.771248 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e607377f-9f4c-4f40-8d5c-17487eb054b8-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls\" (UID: \"e607377f-9f4c-4f40-8d5c-17487eb054b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls" Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.776109 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e607377f-9f4c-4f40-8d5c-17487eb054b8-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls\" (UID: \"e607377f-9f4c-4f40-8d5c-17487eb054b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls" Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.776437 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e607377f-9f4c-4f40-8d5c-17487eb054b8-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls\" (UID: \"e607377f-9f4c-4f40-8d5c-17487eb054b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls" Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.777499 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e607377f-9f4c-4f40-8d5c-17487eb054b8-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls\" (UID: \"e607377f-9f4c-4f40-8d5c-17487eb054b8\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls" Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.784045 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7dln\" (UniqueName: \"kubernetes.io/projected/e607377f-9f4c-4f40-8d5c-17487eb054b8-kube-api-access-p7dln\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls\" (UID: \"e607377f-9f4c-4f40-8d5c-17487eb054b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls" Feb 28 09:24:40 crc kubenswrapper[4687]: I0228 09:24:40.902921 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls" Feb 28 09:24:41 crc kubenswrapper[4687]: I0228 09:24:41.263286 4687 scope.go:117] "RemoveContainer" containerID="43d44663884a598f4ca038e01e490f16d11416c524dc79aa1973d329172ede08" Feb 28 09:24:41 crc kubenswrapper[4687]: I0228 09:24:41.366931 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls"] Feb 28 09:24:41 crc kubenswrapper[4687]: I0228 09:24:41.543502 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls" event={"ID":"e607377f-9f4c-4f40-8d5c-17487eb054b8","Type":"ContainerStarted","Data":"58816113590aea9b44456eaf4228a0791819e48a9db0989c738030537185249e"} Feb 28 09:24:42 crc kubenswrapper[4687]: I0228 09:24:42.552800 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls" event={"ID":"e607377f-9f4c-4f40-8d5c-17487eb054b8","Type":"ContainerStarted","Data":"39d1f91383baf2a1f96e78d052edf975fce634d275a2d27c013199a926cf47b8"} Feb 28 09:24:42 crc kubenswrapper[4687]: I0228 09:24:42.567963 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls" podStartSLOduration=2.123012143 
podStartE2EDuration="2.56794872s" podCreationTimestamp="2026-02-28 09:24:40 +0000 UTC" firstStartedPulling="2026-02-28 09:24:41.373751897 +0000 UTC m=+1273.064321234" lastFinishedPulling="2026-02-28 09:24:41.818688474 +0000 UTC m=+1273.509257811" observedRunningTime="2026-02-28 09:24:42.5654651 +0000 UTC m=+1274.256034447" watchObservedRunningTime="2026-02-28 09:24:42.56794872 +0000 UTC m=+1274.258518057" Feb 28 09:24:55 crc kubenswrapper[4687]: I0228 09:24:55.002918 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:24:55 crc kubenswrapper[4687]: I0228 09:24:55.003781 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:25:25 crc kubenswrapper[4687]: I0228 09:25:25.002537 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:25:25 crc kubenswrapper[4687]: I0228 09:25:25.002998 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:25:41 crc kubenswrapper[4687]: I0228 09:25:41.350444 4687 scope.go:117] 
"RemoveContainer" containerID="c15344c70aef7423dfc2e08971dd45b9339c447cf7155ff2fb15d14bf09fdc1d" Feb 28 09:25:41 crc kubenswrapper[4687]: I0228 09:25:41.378831 4687 scope.go:117] "RemoveContainer" containerID="3e5bb00299b86f3ec0a992669cd92b2f35aa21ca0d58a4937ad5cb5f0b571e63" Feb 28 09:25:55 crc kubenswrapper[4687]: I0228 09:25:55.002809 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:25:55 crc kubenswrapper[4687]: I0228 09:25:55.003297 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:25:55 crc kubenswrapper[4687]: I0228 09:25:55.003343 4687 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" Feb 28 09:25:55 crc kubenswrapper[4687]: I0228 09:25:55.003805 4687 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"26defbc0a15ba55a0f8e3a7678fa01c73c7ea2162c34ef63cf8b44425106ed7e"} pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 28 09:25:55 crc kubenswrapper[4687]: I0228 09:25:55.003861 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" 
containerID="cri-o://26defbc0a15ba55a0f8e3a7678fa01c73c7ea2162c34ef63cf8b44425106ed7e" gracePeriod=600 Feb 28 09:25:55 crc kubenswrapper[4687]: I0228 09:25:55.216159 4687 generic.go:334] "Generic (PLEG): container finished" podID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerID="26defbc0a15ba55a0f8e3a7678fa01c73c7ea2162c34ef63cf8b44425106ed7e" exitCode=0 Feb 28 09:25:55 crc kubenswrapper[4687]: I0228 09:25:55.216250 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerDied","Data":"26defbc0a15ba55a0f8e3a7678fa01c73c7ea2162c34ef63cf8b44425106ed7e"} Feb 28 09:25:55 crc kubenswrapper[4687]: I0228 09:25:55.216391 4687 scope.go:117] "RemoveContainer" containerID="70e6449ca6d918497ca91c82bcac17a1011e8ea5698b1bdf893e712bee9903d3" Feb 28 09:25:56 crc kubenswrapper[4687]: I0228 09:25:56.227833 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerStarted","Data":"3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f"} Feb 28 09:26:00 crc kubenswrapper[4687]: I0228 09:26:00.136960 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537846-hkvwm"] Feb 28 09:26:00 crc kubenswrapper[4687]: I0228 09:26:00.138606 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537846-hkvwm" Feb 28 09:26:00 crc kubenswrapper[4687]: I0228 09:26:00.140152 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7v6b\" (UniqueName: \"kubernetes.io/projected/3d025ea4-23bc-45b7-b5c3-4f35b3d9d431-kube-api-access-l7v6b\") pod \"auto-csr-approver-29537846-hkvwm\" (UID: \"3d025ea4-23bc-45b7-b5c3-4f35b3d9d431\") " pod="openshift-infra/auto-csr-approver-29537846-hkvwm" Feb 28 09:26:00 crc kubenswrapper[4687]: I0228 09:26:00.140183 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:26:00 crc kubenswrapper[4687]: I0228 09:26:00.140354 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:26:00 crc kubenswrapper[4687]: I0228 09:26:00.144623 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537846-hkvwm"] Feb 28 09:26:00 crc kubenswrapper[4687]: I0228 09:26:00.145640 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:26:00 crc kubenswrapper[4687]: I0228 09:26:00.241783 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7v6b\" (UniqueName: \"kubernetes.io/projected/3d025ea4-23bc-45b7-b5c3-4f35b3d9d431-kube-api-access-l7v6b\") pod \"auto-csr-approver-29537846-hkvwm\" (UID: \"3d025ea4-23bc-45b7-b5c3-4f35b3d9d431\") " pod="openshift-infra/auto-csr-approver-29537846-hkvwm" Feb 28 09:26:00 crc kubenswrapper[4687]: I0228 09:26:00.258233 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7v6b\" (UniqueName: \"kubernetes.io/projected/3d025ea4-23bc-45b7-b5c3-4f35b3d9d431-kube-api-access-l7v6b\") pod \"auto-csr-approver-29537846-hkvwm\" (UID: \"3d025ea4-23bc-45b7-b5c3-4f35b3d9d431\") " 
pod="openshift-infra/auto-csr-approver-29537846-hkvwm" Feb 28 09:26:00 crc kubenswrapper[4687]: I0228 09:26:00.452190 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537846-hkvwm" Feb 28 09:26:00 crc kubenswrapper[4687]: I0228 09:26:00.827585 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537846-hkvwm"] Feb 28 09:26:00 crc kubenswrapper[4687]: W0228 09:26:00.829965 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d025ea4_23bc_45b7_b5c3_4f35b3d9d431.slice/crio-5f45cc38bbd46c5c41847e7ada77d55b9dd241ad7424d1c50924ff2d8299d008 WatchSource:0}: Error finding container 5f45cc38bbd46c5c41847e7ada77d55b9dd241ad7424d1c50924ff2d8299d008: Status 404 returned error can't find the container with id 5f45cc38bbd46c5c41847e7ada77d55b9dd241ad7424d1c50924ff2d8299d008 Feb 28 09:26:01 crc kubenswrapper[4687]: I0228 09:26:01.268427 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537846-hkvwm" event={"ID":"3d025ea4-23bc-45b7-b5c3-4f35b3d9d431","Type":"ContainerStarted","Data":"5f45cc38bbd46c5c41847e7ada77d55b9dd241ad7424d1c50924ff2d8299d008"} Feb 28 09:26:02 crc kubenswrapper[4687]: E0228 09:26:02.081795 4687 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d025ea4_23bc_45b7_b5c3_4f35b3d9d431.slice/crio-conmon-df80ab0853ae48c8725369755041e05ead28d3cc7e78bd001524e8958836fba5.scope\": RecentStats: unable to find data in memory cache]" Feb 28 09:26:02 crc kubenswrapper[4687]: I0228 09:26:02.279623 4687 generic.go:334] "Generic (PLEG): container finished" podID="3d025ea4-23bc-45b7-b5c3-4f35b3d9d431" containerID="df80ab0853ae48c8725369755041e05ead28d3cc7e78bd001524e8958836fba5" exitCode=0 Feb 28 09:26:02 crc 
kubenswrapper[4687]: I0228 09:26:02.279733 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537846-hkvwm" event={"ID":"3d025ea4-23bc-45b7-b5c3-4f35b3d9d431","Type":"ContainerDied","Data":"df80ab0853ae48c8725369755041e05ead28d3cc7e78bd001524e8958836fba5"} Feb 28 09:26:03 crc kubenswrapper[4687]: I0228 09:26:03.532913 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537846-hkvwm" Feb 28 09:26:03 crc kubenswrapper[4687]: I0228 09:26:03.700879 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7v6b\" (UniqueName: \"kubernetes.io/projected/3d025ea4-23bc-45b7-b5c3-4f35b3d9d431-kube-api-access-l7v6b\") pod \"3d025ea4-23bc-45b7-b5c3-4f35b3d9d431\" (UID: \"3d025ea4-23bc-45b7-b5c3-4f35b3d9d431\") " Feb 28 09:26:03 crc kubenswrapper[4687]: I0228 09:26:03.706286 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d025ea4-23bc-45b7-b5c3-4f35b3d9d431-kube-api-access-l7v6b" (OuterVolumeSpecName: "kube-api-access-l7v6b") pod "3d025ea4-23bc-45b7-b5c3-4f35b3d9d431" (UID: "3d025ea4-23bc-45b7-b5c3-4f35b3d9d431"). InnerVolumeSpecName "kube-api-access-l7v6b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:26:03 crc kubenswrapper[4687]: I0228 09:26:03.803861 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7v6b\" (UniqueName: \"kubernetes.io/projected/3d025ea4-23bc-45b7-b5c3-4f35b3d9d431-kube-api-access-l7v6b\") on node \"crc\" DevicePath \"\"" Feb 28 09:26:04 crc kubenswrapper[4687]: I0228 09:26:04.296080 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537846-hkvwm" event={"ID":"3d025ea4-23bc-45b7-b5c3-4f35b3d9d431","Type":"ContainerDied","Data":"5f45cc38bbd46c5c41847e7ada77d55b9dd241ad7424d1c50924ff2d8299d008"} Feb 28 09:26:04 crc kubenswrapper[4687]: I0228 09:26:04.296334 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f45cc38bbd46c5c41847e7ada77d55b9dd241ad7424d1c50924ff2d8299d008" Feb 28 09:26:04 crc kubenswrapper[4687]: I0228 09:26:04.296147 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537846-hkvwm" Feb 28 09:26:04 crc kubenswrapper[4687]: I0228 09:26:04.586380 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537840-2wf4q"] Feb 28 09:26:04 crc kubenswrapper[4687]: I0228 09:26:04.592188 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537840-2wf4q"] Feb 28 09:26:04 crc kubenswrapper[4687]: I0228 09:26:04.664655 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="694d7626-7d52-4f55-a8c3-79feaec0e5e2" path="/var/lib/kubelet/pods/694d7626-7d52-4f55-a8c3-79feaec0e5e2/volumes" Feb 28 09:26:20 crc kubenswrapper[4687]: I0228 09:26:20.277359 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rlf7f"] Feb 28 09:26:20 crc kubenswrapper[4687]: E0228 09:26:20.279304 4687 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3d025ea4-23bc-45b7-b5c3-4f35b3d9d431" containerName="oc" Feb 28 09:26:20 crc kubenswrapper[4687]: I0228 09:26:20.279437 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d025ea4-23bc-45b7-b5c3-4f35b3d9d431" containerName="oc" Feb 28 09:26:20 crc kubenswrapper[4687]: I0228 09:26:20.280061 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d025ea4-23bc-45b7-b5c3-4f35b3d9d431" containerName="oc" Feb 28 09:26:20 crc kubenswrapper[4687]: I0228 09:26:20.283061 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rlf7f" Feb 28 09:26:20 crc kubenswrapper[4687]: I0228 09:26:20.294239 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a5ffade-2391-4290-852c-4058a3a63a20-utilities\") pod \"redhat-operators-rlf7f\" (UID: \"6a5ffade-2391-4290-852c-4058a3a63a20\") " pod="openshift-marketplace/redhat-operators-rlf7f" Feb 28 09:26:20 crc kubenswrapper[4687]: I0228 09:26:20.294605 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a5ffade-2391-4290-852c-4058a3a63a20-catalog-content\") pod \"redhat-operators-rlf7f\" (UID: \"6a5ffade-2391-4290-852c-4058a3a63a20\") " pod="openshift-marketplace/redhat-operators-rlf7f" Feb 28 09:26:20 crc kubenswrapper[4687]: I0228 09:26:20.294750 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtrdb\" (UniqueName: \"kubernetes.io/projected/6a5ffade-2391-4290-852c-4058a3a63a20-kube-api-access-mtrdb\") pod \"redhat-operators-rlf7f\" (UID: \"6a5ffade-2391-4290-852c-4058a3a63a20\") " pod="openshift-marketplace/redhat-operators-rlf7f" Feb 28 09:26:20 crc kubenswrapper[4687]: I0228 09:26:20.303623 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-rlf7f"] Feb 28 09:26:20 crc kubenswrapper[4687]: I0228 09:26:20.395752 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a5ffade-2391-4290-852c-4058a3a63a20-catalog-content\") pod \"redhat-operators-rlf7f\" (UID: \"6a5ffade-2391-4290-852c-4058a3a63a20\") " pod="openshift-marketplace/redhat-operators-rlf7f" Feb 28 09:26:20 crc kubenswrapper[4687]: I0228 09:26:20.395823 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtrdb\" (UniqueName: \"kubernetes.io/projected/6a5ffade-2391-4290-852c-4058a3a63a20-kube-api-access-mtrdb\") pod \"redhat-operators-rlf7f\" (UID: \"6a5ffade-2391-4290-852c-4058a3a63a20\") " pod="openshift-marketplace/redhat-operators-rlf7f" Feb 28 09:26:20 crc kubenswrapper[4687]: I0228 09:26:20.395850 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a5ffade-2391-4290-852c-4058a3a63a20-utilities\") pod \"redhat-operators-rlf7f\" (UID: \"6a5ffade-2391-4290-852c-4058a3a63a20\") " pod="openshift-marketplace/redhat-operators-rlf7f" Feb 28 09:26:20 crc kubenswrapper[4687]: I0228 09:26:20.396322 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a5ffade-2391-4290-852c-4058a3a63a20-catalog-content\") pod \"redhat-operators-rlf7f\" (UID: \"6a5ffade-2391-4290-852c-4058a3a63a20\") " pod="openshift-marketplace/redhat-operators-rlf7f" Feb 28 09:26:20 crc kubenswrapper[4687]: I0228 09:26:20.396345 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a5ffade-2391-4290-852c-4058a3a63a20-utilities\") pod \"redhat-operators-rlf7f\" (UID: \"6a5ffade-2391-4290-852c-4058a3a63a20\") " pod="openshift-marketplace/redhat-operators-rlf7f" Feb 28 09:26:20 
crc kubenswrapper[4687]: I0228 09:26:20.414410 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtrdb\" (UniqueName: \"kubernetes.io/projected/6a5ffade-2391-4290-852c-4058a3a63a20-kube-api-access-mtrdb\") pod \"redhat-operators-rlf7f\" (UID: \"6a5ffade-2391-4290-852c-4058a3a63a20\") " pod="openshift-marketplace/redhat-operators-rlf7f" Feb 28 09:26:20 crc kubenswrapper[4687]: I0228 09:26:20.612340 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rlf7f" Feb 28 09:26:21 crc kubenswrapper[4687]: I0228 09:26:21.007603 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rlf7f"] Feb 28 09:26:21 crc kubenswrapper[4687]: I0228 09:26:21.427789 4687 generic.go:334] "Generic (PLEG): container finished" podID="6a5ffade-2391-4290-852c-4058a3a63a20" containerID="64cfd3978856e515599da1e8bca8fc74de960253c77879214fc3a8f6409a30d5" exitCode=0 Feb 28 09:26:21 crc kubenswrapper[4687]: I0228 09:26:21.427895 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlf7f" event={"ID":"6a5ffade-2391-4290-852c-4058a3a63a20","Type":"ContainerDied","Data":"64cfd3978856e515599da1e8bca8fc74de960253c77879214fc3a8f6409a30d5"} Feb 28 09:26:21 crc kubenswrapper[4687]: I0228 09:26:21.428634 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlf7f" event={"ID":"6a5ffade-2391-4290-852c-4058a3a63a20","Type":"ContainerStarted","Data":"510ede216b7ab6c72bb005d9ea5698790ddae0747f0b4d7964b2befef3f52df7"} Feb 28 09:26:22 crc kubenswrapper[4687]: I0228 09:26:22.440516 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlf7f" event={"ID":"6a5ffade-2391-4290-852c-4058a3a63a20","Type":"ContainerStarted","Data":"8be98ebe783a7913fe2e898a70f6daeafd2ab8d4a8569d922883e02f2980f6c1"} Feb 28 09:26:24 crc kubenswrapper[4687]: 
I0228 09:26:24.458804 4687 generic.go:334] "Generic (PLEG): container finished" podID="6a5ffade-2391-4290-852c-4058a3a63a20" containerID="8be98ebe783a7913fe2e898a70f6daeafd2ab8d4a8569d922883e02f2980f6c1" exitCode=0 Feb 28 09:26:24 crc kubenswrapper[4687]: I0228 09:26:24.458867 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlf7f" event={"ID":"6a5ffade-2391-4290-852c-4058a3a63a20","Type":"ContainerDied","Data":"8be98ebe783a7913fe2e898a70f6daeafd2ab8d4a8569d922883e02f2980f6c1"} Feb 28 09:26:25 crc kubenswrapper[4687]: I0228 09:26:25.468261 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlf7f" event={"ID":"6a5ffade-2391-4290-852c-4058a3a63a20","Type":"ContainerStarted","Data":"49206e59152924ade0ce60b65f3e0ebdb9bcabf5053d9bc6b162fc74040233fe"} Feb 28 09:26:25 crc kubenswrapper[4687]: I0228 09:26:25.484609 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rlf7f" podStartSLOduration=1.9566713409999998 podStartE2EDuration="5.484593028s" podCreationTimestamp="2026-02-28 09:26:20 +0000 UTC" firstStartedPulling="2026-02-28 09:26:21.429545605 +0000 UTC m=+1373.120114942" lastFinishedPulling="2026-02-28 09:26:24.957467293 +0000 UTC m=+1376.648036629" observedRunningTime="2026-02-28 09:26:25.480430591 +0000 UTC m=+1377.170999918" watchObservedRunningTime="2026-02-28 09:26:25.484593028 +0000 UTC m=+1377.175162365" Feb 28 09:26:30 crc kubenswrapper[4687]: I0228 09:26:30.612797 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rlf7f" Feb 28 09:26:30 crc kubenswrapper[4687]: I0228 09:26:30.613389 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rlf7f" Feb 28 09:26:30 crc kubenswrapper[4687]: I0228 09:26:30.647975 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-rlf7f" Feb 28 09:26:31 crc kubenswrapper[4687]: I0228 09:26:31.552129 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rlf7f" Feb 28 09:26:31 crc kubenswrapper[4687]: I0228 09:26:31.601792 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rlf7f"] Feb 28 09:26:33 crc kubenswrapper[4687]: I0228 09:26:33.537794 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rlf7f" podUID="6a5ffade-2391-4290-852c-4058a3a63a20" containerName="registry-server" containerID="cri-o://49206e59152924ade0ce60b65f3e0ebdb9bcabf5053d9bc6b162fc74040233fe" gracePeriod=2 Feb 28 09:26:33 crc kubenswrapper[4687]: I0228 09:26:33.942663 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rlf7f" Feb 28 09:26:34 crc kubenswrapper[4687]: I0228 09:26:34.054919 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a5ffade-2391-4290-852c-4058a3a63a20-utilities\") pod \"6a5ffade-2391-4290-852c-4058a3a63a20\" (UID: \"6a5ffade-2391-4290-852c-4058a3a63a20\") " Feb 28 09:26:34 crc kubenswrapper[4687]: I0228 09:26:34.055306 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtrdb\" (UniqueName: \"kubernetes.io/projected/6a5ffade-2391-4290-852c-4058a3a63a20-kube-api-access-mtrdb\") pod \"6a5ffade-2391-4290-852c-4058a3a63a20\" (UID: \"6a5ffade-2391-4290-852c-4058a3a63a20\") " Feb 28 09:26:34 crc kubenswrapper[4687]: I0228 09:26:34.055484 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a5ffade-2391-4290-852c-4058a3a63a20-catalog-content\") pod 
\"6a5ffade-2391-4290-852c-4058a3a63a20\" (UID: \"6a5ffade-2391-4290-852c-4058a3a63a20\") " Feb 28 09:26:34 crc kubenswrapper[4687]: I0228 09:26:34.055614 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a5ffade-2391-4290-852c-4058a3a63a20-utilities" (OuterVolumeSpecName: "utilities") pod "6a5ffade-2391-4290-852c-4058a3a63a20" (UID: "6a5ffade-2391-4290-852c-4058a3a63a20"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:26:34 crc kubenswrapper[4687]: I0228 09:26:34.055996 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a5ffade-2391-4290-852c-4058a3a63a20-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:26:34 crc kubenswrapper[4687]: I0228 09:26:34.060498 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a5ffade-2391-4290-852c-4058a3a63a20-kube-api-access-mtrdb" (OuterVolumeSpecName: "kube-api-access-mtrdb") pod "6a5ffade-2391-4290-852c-4058a3a63a20" (UID: "6a5ffade-2391-4290-852c-4058a3a63a20"). InnerVolumeSpecName "kube-api-access-mtrdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:26:34 crc kubenswrapper[4687]: I0228 09:26:34.152554 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a5ffade-2391-4290-852c-4058a3a63a20-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a5ffade-2391-4290-852c-4058a3a63a20" (UID: "6a5ffade-2391-4290-852c-4058a3a63a20"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:26:34 crc kubenswrapper[4687]: I0228 09:26:34.158309 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a5ffade-2391-4290-852c-4058a3a63a20-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:26:34 crc kubenswrapper[4687]: I0228 09:26:34.158353 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtrdb\" (UniqueName: \"kubernetes.io/projected/6a5ffade-2391-4290-852c-4058a3a63a20-kube-api-access-mtrdb\") on node \"crc\" DevicePath \"\"" Feb 28 09:26:34 crc kubenswrapper[4687]: I0228 09:26:34.545837 4687 generic.go:334] "Generic (PLEG): container finished" podID="6a5ffade-2391-4290-852c-4058a3a63a20" containerID="49206e59152924ade0ce60b65f3e0ebdb9bcabf5053d9bc6b162fc74040233fe" exitCode=0 Feb 28 09:26:34 crc kubenswrapper[4687]: I0228 09:26:34.545882 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlf7f" event={"ID":"6a5ffade-2391-4290-852c-4058a3a63a20","Type":"ContainerDied","Data":"49206e59152924ade0ce60b65f3e0ebdb9bcabf5053d9bc6b162fc74040233fe"} Feb 28 09:26:34 crc kubenswrapper[4687]: I0228 09:26:34.545905 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rlf7f" Feb 28 09:26:34 crc kubenswrapper[4687]: I0228 09:26:34.545940 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlf7f" event={"ID":"6a5ffade-2391-4290-852c-4058a3a63a20","Type":"ContainerDied","Data":"510ede216b7ab6c72bb005d9ea5698790ddae0747f0b4d7964b2befef3f52df7"} Feb 28 09:26:34 crc kubenswrapper[4687]: I0228 09:26:34.545963 4687 scope.go:117] "RemoveContainer" containerID="49206e59152924ade0ce60b65f3e0ebdb9bcabf5053d9bc6b162fc74040233fe" Feb 28 09:26:34 crc kubenswrapper[4687]: I0228 09:26:34.564061 4687 scope.go:117] "RemoveContainer" containerID="8be98ebe783a7913fe2e898a70f6daeafd2ab8d4a8569d922883e02f2980f6c1" Feb 28 09:26:34 crc kubenswrapper[4687]: I0228 09:26:34.575283 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rlf7f"] Feb 28 09:26:34 crc kubenswrapper[4687]: I0228 09:26:34.581944 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rlf7f"] Feb 28 09:26:34 crc kubenswrapper[4687]: I0228 09:26:34.601053 4687 scope.go:117] "RemoveContainer" containerID="64cfd3978856e515599da1e8bca8fc74de960253c77879214fc3a8f6409a30d5" Feb 28 09:26:34 crc kubenswrapper[4687]: I0228 09:26:34.619890 4687 scope.go:117] "RemoveContainer" containerID="49206e59152924ade0ce60b65f3e0ebdb9bcabf5053d9bc6b162fc74040233fe" Feb 28 09:26:34 crc kubenswrapper[4687]: E0228 09:26:34.620310 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49206e59152924ade0ce60b65f3e0ebdb9bcabf5053d9bc6b162fc74040233fe\": container with ID starting with 49206e59152924ade0ce60b65f3e0ebdb9bcabf5053d9bc6b162fc74040233fe not found: ID does not exist" containerID="49206e59152924ade0ce60b65f3e0ebdb9bcabf5053d9bc6b162fc74040233fe" Feb 28 09:26:34 crc kubenswrapper[4687]: I0228 09:26:34.620342 4687 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49206e59152924ade0ce60b65f3e0ebdb9bcabf5053d9bc6b162fc74040233fe"} err="failed to get container status \"49206e59152924ade0ce60b65f3e0ebdb9bcabf5053d9bc6b162fc74040233fe\": rpc error: code = NotFound desc = could not find container \"49206e59152924ade0ce60b65f3e0ebdb9bcabf5053d9bc6b162fc74040233fe\": container with ID starting with 49206e59152924ade0ce60b65f3e0ebdb9bcabf5053d9bc6b162fc74040233fe not found: ID does not exist" Feb 28 09:26:34 crc kubenswrapper[4687]: I0228 09:26:34.620363 4687 scope.go:117] "RemoveContainer" containerID="8be98ebe783a7913fe2e898a70f6daeafd2ab8d4a8569d922883e02f2980f6c1" Feb 28 09:26:34 crc kubenswrapper[4687]: E0228 09:26:34.620645 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8be98ebe783a7913fe2e898a70f6daeafd2ab8d4a8569d922883e02f2980f6c1\": container with ID starting with 8be98ebe783a7913fe2e898a70f6daeafd2ab8d4a8569d922883e02f2980f6c1 not found: ID does not exist" containerID="8be98ebe783a7913fe2e898a70f6daeafd2ab8d4a8569d922883e02f2980f6c1" Feb 28 09:26:34 crc kubenswrapper[4687]: I0228 09:26:34.620681 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be98ebe783a7913fe2e898a70f6daeafd2ab8d4a8569d922883e02f2980f6c1"} err="failed to get container status \"8be98ebe783a7913fe2e898a70f6daeafd2ab8d4a8569d922883e02f2980f6c1\": rpc error: code = NotFound desc = could not find container \"8be98ebe783a7913fe2e898a70f6daeafd2ab8d4a8569d922883e02f2980f6c1\": container with ID starting with 8be98ebe783a7913fe2e898a70f6daeafd2ab8d4a8569d922883e02f2980f6c1 not found: ID does not exist" Feb 28 09:26:34 crc kubenswrapper[4687]: I0228 09:26:34.620708 4687 scope.go:117] "RemoveContainer" containerID="64cfd3978856e515599da1e8bca8fc74de960253c77879214fc3a8f6409a30d5" Feb 28 09:26:34 crc kubenswrapper[4687]: E0228 
09:26:34.620981 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64cfd3978856e515599da1e8bca8fc74de960253c77879214fc3a8f6409a30d5\": container with ID starting with 64cfd3978856e515599da1e8bca8fc74de960253c77879214fc3a8f6409a30d5 not found: ID does not exist" containerID="64cfd3978856e515599da1e8bca8fc74de960253c77879214fc3a8f6409a30d5" Feb 28 09:26:34 crc kubenswrapper[4687]: I0228 09:26:34.621004 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64cfd3978856e515599da1e8bca8fc74de960253c77879214fc3a8f6409a30d5"} err="failed to get container status \"64cfd3978856e515599da1e8bca8fc74de960253c77879214fc3a8f6409a30d5\": rpc error: code = NotFound desc = could not find container \"64cfd3978856e515599da1e8bca8fc74de960253c77879214fc3a8f6409a30d5\": container with ID starting with 64cfd3978856e515599da1e8bca8fc74de960253c77879214fc3a8f6409a30d5 not found: ID does not exist" Feb 28 09:26:34 crc kubenswrapper[4687]: I0228 09:26:34.665145 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a5ffade-2391-4290-852c-4058a3a63a20" path="/var/lib/kubelet/pods/6a5ffade-2391-4290-852c-4058a3a63a20/volumes" Feb 28 09:26:41 crc kubenswrapper[4687]: I0228 09:26:41.447372 4687 scope.go:117] "RemoveContainer" containerID="028844f2d4127d97e4dcbbf0a6c2f4aa6f538feb591e1cd7ad283e048ad0153f" Feb 28 09:26:41 crc kubenswrapper[4687]: I0228 09:26:41.465812 4687 scope.go:117] "RemoveContainer" containerID="fe9ba56e9608511d072698b0a6f39183abd2a7895b689c86907310814b551612" Feb 28 09:26:41 crc kubenswrapper[4687]: I0228 09:26:41.501518 4687 scope.go:117] "RemoveContainer" containerID="55cff91936e24c603712a05134d1af3e9e2eab28a8a118594290f9969c5201d8" Feb 28 09:26:41 crc kubenswrapper[4687]: I0228 09:26:41.534255 4687 scope.go:117] "RemoveContainer" containerID="8bd1539f05f84dff93650ce81fe1fb27a301643199250c07815c3f641b7b68d3" Feb 28 09:26:41 crc 
kubenswrapper[4687]: I0228 09:26:41.892577 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ms7l6"] Feb 28 09:26:41 crc kubenswrapper[4687]: E0228 09:26:41.892999 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a5ffade-2391-4290-852c-4058a3a63a20" containerName="extract-content" Feb 28 09:26:41 crc kubenswrapper[4687]: I0228 09:26:41.893035 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5ffade-2391-4290-852c-4058a3a63a20" containerName="extract-content" Feb 28 09:26:41 crc kubenswrapper[4687]: E0228 09:26:41.893062 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a5ffade-2391-4290-852c-4058a3a63a20" containerName="registry-server" Feb 28 09:26:41 crc kubenswrapper[4687]: I0228 09:26:41.893068 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5ffade-2391-4290-852c-4058a3a63a20" containerName="registry-server" Feb 28 09:26:41 crc kubenswrapper[4687]: E0228 09:26:41.893085 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a5ffade-2391-4290-852c-4058a3a63a20" containerName="extract-utilities" Feb 28 09:26:41 crc kubenswrapper[4687]: I0228 09:26:41.893090 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5ffade-2391-4290-852c-4058a3a63a20" containerName="extract-utilities" Feb 28 09:26:41 crc kubenswrapper[4687]: I0228 09:26:41.894175 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a5ffade-2391-4290-852c-4058a3a63a20" containerName="registry-server" Feb 28 09:26:41 crc kubenswrapper[4687]: I0228 09:26:41.895559 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ms7l6" Feb 28 09:26:41 crc kubenswrapper[4687]: I0228 09:26:41.907367 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ms7l6"] Feb 28 09:26:42 crc kubenswrapper[4687]: I0228 09:26:42.005106 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848620c2-360a-4ba3-b5d0-14da997f77de-utilities\") pod \"certified-operators-ms7l6\" (UID: \"848620c2-360a-4ba3-b5d0-14da997f77de\") " pod="openshift-marketplace/certified-operators-ms7l6" Feb 28 09:26:42 crc kubenswrapper[4687]: I0228 09:26:42.005150 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848620c2-360a-4ba3-b5d0-14da997f77de-catalog-content\") pod \"certified-operators-ms7l6\" (UID: \"848620c2-360a-4ba3-b5d0-14da997f77de\") " pod="openshift-marketplace/certified-operators-ms7l6" Feb 28 09:26:42 crc kubenswrapper[4687]: I0228 09:26:42.005214 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9kwp\" (UniqueName: \"kubernetes.io/projected/848620c2-360a-4ba3-b5d0-14da997f77de-kube-api-access-w9kwp\") pod \"certified-operators-ms7l6\" (UID: \"848620c2-360a-4ba3-b5d0-14da997f77de\") " pod="openshift-marketplace/certified-operators-ms7l6" Feb 28 09:26:42 crc kubenswrapper[4687]: I0228 09:26:42.106859 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848620c2-360a-4ba3-b5d0-14da997f77de-utilities\") pod \"certified-operators-ms7l6\" (UID: \"848620c2-360a-4ba3-b5d0-14da997f77de\") " pod="openshift-marketplace/certified-operators-ms7l6" Feb 28 09:26:42 crc kubenswrapper[4687]: I0228 09:26:42.106907 4687 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848620c2-360a-4ba3-b5d0-14da997f77de-catalog-content\") pod \"certified-operators-ms7l6\" (UID: \"848620c2-360a-4ba3-b5d0-14da997f77de\") " pod="openshift-marketplace/certified-operators-ms7l6" Feb 28 09:26:42 crc kubenswrapper[4687]: I0228 09:26:42.106935 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9kwp\" (UniqueName: \"kubernetes.io/projected/848620c2-360a-4ba3-b5d0-14da997f77de-kube-api-access-w9kwp\") pod \"certified-operators-ms7l6\" (UID: \"848620c2-360a-4ba3-b5d0-14da997f77de\") " pod="openshift-marketplace/certified-operators-ms7l6" Feb 28 09:26:42 crc kubenswrapper[4687]: I0228 09:26:42.107600 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848620c2-360a-4ba3-b5d0-14da997f77de-utilities\") pod \"certified-operators-ms7l6\" (UID: \"848620c2-360a-4ba3-b5d0-14da997f77de\") " pod="openshift-marketplace/certified-operators-ms7l6" Feb 28 09:26:42 crc kubenswrapper[4687]: I0228 09:26:42.107650 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848620c2-360a-4ba3-b5d0-14da997f77de-catalog-content\") pod \"certified-operators-ms7l6\" (UID: \"848620c2-360a-4ba3-b5d0-14da997f77de\") " pod="openshift-marketplace/certified-operators-ms7l6" Feb 28 09:26:42 crc kubenswrapper[4687]: I0228 09:26:42.124082 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9kwp\" (UniqueName: \"kubernetes.io/projected/848620c2-360a-4ba3-b5d0-14da997f77de-kube-api-access-w9kwp\") pod \"certified-operators-ms7l6\" (UID: \"848620c2-360a-4ba3-b5d0-14da997f77de\") " pod="openshift-marketplace/certified-operators-ms7l6" Feb 28 09:26:42 crc kubenswrapper[4687]: I0228 09:26:42.212102 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ms7l6" Feb 28 09:26:42 crc kubenswrapper[4687]: I0228 09:26:42.620987 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ms7l6"] Feb 28 09:26:42 crc kubenswrapper[4687]: E0228 09:26:42.902003 4687 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod848620c2_360a_4ba3_b5d0_14da997f77de.slice/crio-e72009180cd46a6ede7fc873c359f50ddd0eaae972bf44809e6d1960e897a409.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod848620c2_360a_4ba3_b5d0_14da997f77de.slice/crio-conmon-e72009180cd46a6ede7fc873c359f50ddd0eaae972bf44809e6d1960e897a409.scope\": RecentStats: unable to find data in memory cache]" Feb 28 09:26:43 crc kubenswrapper[4687]: I0228 09:26:43.626368 4687 generic.go:334] "Generic (PLEG): container finished" podID="848620c2-360a-4ba3-b5d0-14da997f77de" containerID="e72009180cd46a6ede7fc873c359f50ddd0eaae972bf44809e6d1960e897a409" exitCode=0 Feb 28 09:26:43 crc kubenswrapper[4687]: I0228 09:26:43.626466 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ms7l6" event={"ID":"848620c2-360a-4ba3-b5d0-14da997f77de","Type":"ContainerDied","Data":"e72009180cd46a6ede7fc873c359f50ddd0eaae972bf44809e6d1960e897a409"} Feb 28 09:26:43 crc kubenswrapper[4687]: I0228 09:26:43.626639 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ms7l6" event={"ID":"848620c2-360a-4ba3-b5d0-14da997f77de","Type":"ContainerStarted","Data":"77692d9b0f004828f558e81975cc80ac864171ac54df58875fbef301f73cf531"} Feb 28 09:26:44 crc kubenswrapper[4687]: I0228 09:26:44.643631 4687 generic.go:334] "Generic (PLEG): container finished" podID="848620c2-360a-4ba3-b5d0-14da997f77de" 
containerID="0f33381d1dca77cef5bf5ed824b95e41f5e786b0e488840662fac597a2cd6e13" exitCode=0 Feb 28 09:26:44 crc kubenswrapper[4687]: I0228 09:26:44.643678 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ms7l6" event={"ID":"848620c2-360a-4ba3-b5d0-14da997f77de","Type":"ContainerDied","Data":"0f33381d1dca77cef5bf5ed824b95e41f5e786b0e488840662fac597a2cd6e13"} Feb 28 09:26:45 crc kubenswrapper[4687]: I0228 09:26:45.679813 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ms7l6" event={"ID":"848620c2-360a-4ba3-b5d0-14da997f77de","Type":"ContainerStarted","Data":"08468dcbcf563b191d2f50f81479e09413276c9ee705b31de0e628564d561087"} Feb 28 09:26:45 crc kubenswrapper[4687]: I0228 09:26:45.698075 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ms7l6" podStartSLOduration=3.219360342 podStartE2EDuration="4.698058276s" podCreationTimestamp="2026-02-28 09:26:41 +0000 UTC" firstStartedPulling="2026-02-28 09:26:43.628162746 +0000 UTC m=+1395.318732083" lastFinishedPulling="2026-02-28 09:26:45.10686068 +0000 UTC m=+1396.797430017" observedRunningTime="2026-02-28 09:26:45.692212836 +0000 UTC m=+1397.382782173" watchObservedRunningTime="2026-02-28 09:26:45.698058276 +0000 UTC m=+1397.388627613" Feb 28 09:26:52 crc kubenswrapper[4687]: I0228 09:26:52.212458 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ms7l6" Feb 28 09:26:52 crc kubenswrapper[4687]: I0228 09:26:52.213004 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ms7l6" Feb 28 09:26:52 crc kubenswrapper[4687]: I0228 09:26:52.246721 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ms7l6" Feb 28 09:26:52 crc kubenswrapper[4687]: I0228 
09:26:52.770694 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ms7l6" Feb 28 09:26:52 crc kubenswrapper[4687]: I0228 09:26:52.812215 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ms7l6"] Feb 28 09:26:54 crc kubenswrapper[4687]: I0228 09:26:54.754473 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ms7l6" podUID="848620c2-360a-4ba3-b5d0-14da997f77de" containerName="registry-server" containerID="cri-o://08468dcbcf563b191d2f50f81479e09413276c9ee705b31de0e628564d561087" gracePeriod=2 Feb 28 09:26:55 crc kubenswrapper[4687]: I0228 09:26:55.643289 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ms7l6" Feb 28 09:26:55 crc kubenswrapper[4687]: I0228 09:26:55.679063 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848620c2-360a-4ba3-b5d0-14da997f77de-utilities\") pod \"848620c2-360a-4ba3-b5d0-14da997f77de\" (UID: \"848620c2-360a-4ba3-b5d0-14da997f77de\") " Feb 28 09:26:55 crc kubenswrapper[4687]: I0228 09:26:55.679128 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9kwp\" (UniqueName: \"kubernetes.io/projected/848620c2-360a-4ba3-b5d0-14da997f77de-kube-api-access-w9kwp\") pod \"848620c2-360a-4ba3-b5d0-14da997f77de\" (UID: \"848620c2-360a-4ba3-b5d0-14da997f77de\") " Feb 28 09:26:55 crc kubenswrapper[4687]: I0228 09:26:55.679207 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848620c2-360a-4ba3-b5d0-14da997f77de-catalog-content\") pod \"848620c2-360a-4ba3-b5d0-14da997f77de\" (UID: \"848620c2-360a-4ba3-b5d0-14da997f77de\") " Feb 28 09:26:55 crc kubenswrapper[4687]: 
I0228 09:26:55.679879 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/848620c2-360a-4ba3-b5d0-14da997f77de-utilities" (OuterVolumeSpecName: "utilities") pod "848620c2-360a-4ba3-b5d0-14da997f77de" (UID: "848620c2-360a-4ba3-b5d0-14da997f77de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:26:55 crc kubenswrapper[4687]: I0228 09:26:55.684818 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/848620c2-360a-4ba3-b5d0-14da997f77de-kube-api-access-w9kwp" (OuterVolumeSpecName: "kube-api-access-w9kwp") pod "848620c2-360a-4ba3-b5d0-14da997f77de" (UID: "848620c2-360a-4ba3-b5d0-14da997f77de"). InnerVolumeSpecName "kube-api-access-w9kwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:26:55 crc kubenswrapper[4687]: I0228 09:26:55.725551 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/848620c2-360a-4ba3-b5d0-14da997f77de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "848620c2-360a-4ba3-b5d0-14da997f77de" (UID: "848620c2-360a-4ba3-b5d0-14da997f77de"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:26:55 crc kubenswrapper[4687]: I0228 09:26:55.764608 4687 generic.go:334] "Generic (PLEG): container finished" podID="848620c2-360a-4ba3-b5d0-14da997f77de" containerID="08468dcbcf563b191d2f50f81479e09413276c9ee705b31de0e628564d561087" exitCode=0 Feb 28 09:26:55 crc kubenswrapper[4687]: I0228 09:26:55.764667 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ms7l6" event={"ID":"848620c2-360a-4ba3-b5d0-14da997f77de","Type":"ContainerDied","Data":"08468dcbcf563b191d2f50f81479e09413276c9ee705b31de0e628564d561087"} Feb 28 09:26:55 crc kubenswrapper[4687]: I0228 09:26:55.764717 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ms7l6" event={"ID":"848620c2-360a-4ba3-b5d0-14da997f77de","Type":"ContainerDied","Data":"77692d9b0f004828f558e81975cc80ac864171ac54df58875fbef301f73cf531"} Feb 28 09:26:55 crc kubenswrapper[4687]: I0228 09:26:55.764741 4687 scope.go:117] "RemoveContainer" containerID="08468dcbcf563b191d2f50f81479e09413276c9ee705b31de0e628564d561087" Feb 28 09:26:55 crc kubenswrapper[4687]: I0228 09:26:55.765498 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ms7l6" Feb 28 09:26:55 crc kubenswrapper[4687]: I0228 09:26:55.780341 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848620c2-360a-4ba3-b5d0-14da997f77de-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:26:55 crc kubenswrapper[4687]: I0228 09:26:55.780366 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848620c2-360a-4ba3-b5d0-14da997f77de-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:26:55 crc kubenswrapper[4687]: I0228 09:26:55.780376 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9kwp\" (UniqueName: \"kubernetes.io/projected/848620c2-360a-4ba3-b5d0-14da997f77de-kube-api-access-w9kwp\") on node \"crc\" DevicePath \"\"" Feb 28 09:26:55 crc kubenswrapper[4687]: I0228 09:26:55.783912 4687 scope.go:117] "RemoveContainer" containerID="0f33381d1dca77cef5bf5ed824b95e41f5e786b0e488840662fac597a2cd6e13" Feb 28 09:26:55 crc kubenswrapper[4687]: I0228 09:26:55.802925 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ms7l6"] Feb 28 09:26:55 crc kubenswrapper[4687]: I0228 09:26:55.814585 4687 scope.go:117] "RemoveContainer" containerID="e72009180cd46a6ede7fc873c359f50ddd0eaae972bf44809e6d1960e897a409" Feb 28 09:26:55 crc kubenswrapper[4687]: I0228 09:26:55.820399 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ms7l6"] Feb 28 09:26:55 crc kubenswrapper[4687]: I0228 09:26:55.844502 4687 scope.go:117] "RemoveContainer" containerID="08468dcbcf563b191d2f50f81479e09413276c9ee705b31de0e628564d561087" Feb 28 09:26:55 crc kubenswrapper[4687]: E0228 09:26:55.844870 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"08468dcbcf563b191d2f50f81479e09413276c9ee705b31de0e628564d561087\": container with ID starting with 08468dcbcf563b191d2f50f81479e09413276c9ee705b31de0e628564d561087 not found: ID does not exist" containerID="08468dcbcf563b191d2f50f81479e09413276c9ee705b31de0e628564d561087" Feb 28 09:26:55 crc kubenswrapper[4687]: I0228 09:26:55.844912 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08468dcbcf563b191d2f50f81479e09413276c9ee705b31de0e628564d561087"} err="failed to get container status \"08468dcbcf563b191d2f50f81479e09413276c9ee705b31de0e628564d561087\": rpc error: code = NotFound desc = could not find container \"08468dcbcf563b191d2f50f81479e09413276c9ee705b31de0e628564d561087\": container with ID starting with 08468dcbcf563b191d2f50f81479e09413276c9ee705b31de0e628564d561087 not found: ID does not exist" Feb 28 09:26:55 crc kubenswrapper[4687]: I0228 09:26:55.844939 4687 scope.go:117] "RemoveContainer" containerID="0f33381d1dca77cef5bf5ed824b95e41f5e786b0e488840662fac597a2cd6e13" Feb 28 09:26:55 crc kubenswrapper[4687]: E0228 09:26:55.845350 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f33381d1dca77cef5bf5ed824b95e41f5e786b0e488840662fac597a2cd6e13\": container with ID starting with 0f33381d1dca77cef5bf5ed824b95e41f5e786b0e488840662fac597a2cd6e13 not found: ID does not exist" containerID="0f33381d1dca77cef5bf5ed824b95e41f5e786b0e488840662fac597a2cd6e13" Feb 28 09:26:55 crc kubenswrapper[4687]: I0228 09:26:55.845386 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f33381d1dca77cef5bf5ed824b95e41f5e786b0e488840662fac597a2cd6e13"} err="failed to get container status \"0f33381d1dca77cef5bf5ed824b95e41f5e786b0e488840662fac597a2cd6e13\": rpc error: code = NotFound desc = could not find container \"0f33381d1dca77cef5bf5ed824b95e41f5e786b0e488840662fac597a2cd6e13\": container with ID 
starting with 0f33381d1dca77cef5bf5ed824b95e41f5e786b0e488840662fac597a2cd6e13 not found: ID does not exist" Feb 28 09:26:55 crc kubenswrapper[4687]: I0228 09:26:55.845409 4687 scope.go:117] "RemoveContainer" containerID="e72009180cd46a6ede7fc873c359f50ddd0eaae972bf44809e6d1960e897a409" Feb 28 09:26:55 crc kubenswrapper[4687]: E0228 09:26:55.845713 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e72009180cd46a6ede7fc873c359f50ddd0eaae972bf44809e6d1960e897a409\": container with ID starting with e72009180cd46a6ede7fc873c359f50ddd0eaae972bf44809e6d1960e897a409 not found: ID does not exist" containerID="e72009180cd46a6ede7fc873c359f50ddd0eaae972bf44809e6d1960e897a409" Feb 28 09:26:55 crc kubenswrapper[4687]: I0228 09:26:55.845749 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e72009180cd46a6ede7fc873c359f50ddd0eaae972bf44809e6d1960e897a409"} err="failed to get container status \"e72009180cd46a6ede7fc873c359f50ddd0eaae972bf44809e6d1960e897a409\": rpc error: code = NotFound desc = could not find container \"e72009180cd46a6ede7fc873c359f50ddd0eaae972bf44809e6d1960e897a409\": container with ID starting with e72009180cd46a6ede7fc873c359f50ddd0eaae972bf44809e6d1960e897a409 not found: ID does not exist" Feb 28 09:26:56 crc kubenswrapper[4687]: I0228 09:26:56.667557 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="848620c2-360a-4ba3-b5d0-14da997f77de" path="/var/lib/kubelet/pods/848620c2-360a-4ba3-b5d0-14da997f77de/volumes" Feb 28 09:27:05 crc kubenswrapper[4687]: I0228 09:27:05.807424 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4lwqz"] Feb 28 09:27:05 crc kubenswrapper[4687]: E0228 09:27:05.808226 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="848620c2-360a-4ba3-b5d0-14da997f77de" containerName="registry-server" Feb 28 09:27:05 crc 
kubenswrapper[4687]: I0228 09:27:05.808239 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="848620c2-360a-4ba3-b5d0-14da997f77de" containerName="registry-server" Feb 28 09:27:05 crc kubenswrapper[4687]: E0228 09:27:05.808250 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="848620c2-360a-4ba3-b5d0-14da997f77de" containerName="extract-content" Feb 28 09:27:05 crc kubenswrapper[4687]: I0228 09:27:05.808256 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="848620c2-360a-4ba3-b5d0-14da997f77de" containerName="extract-content" Feb 28 09:27:05 crc kubenswrapper[4687]: E0228 09:27:05.808272 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="848620c2-360a-4ba3-b5d0-14da997f77de" containerName="extract-utilities" Feb 28 09:27:05 crc kubenswrapper[4687]: I0228 09:27:05.808278 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="848620c2-360a-4ba3-b5d0-14da997f77de" containerName="extract-utilities" Feb 28 09:27:05 crc kubenswrapper[4687]: I0228 09:27:05.808455 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="848620c2-360a-4ba3-b5d0-14da997f77de" containerName="registry-server" Feb 28 09:27:05 crc kubenswrapper[4687]: I0228 09:27:05.809722 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4lwqz" Feb 28 09:27:05 crc kubenswrapper[4687]: I0228 09:27:05.815808 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4lwqz"] Feb 28 09:27:05 crc kubenswrapper[4687]: I0228 09:27:05.842415 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c8f5f55-e68a-483d-b0a4-e859b5ef801a-utilities\") pod \"redhat-marketplace-4lwqz\" (UID: \"2c8f5f55-e68a-483d-b0a4-e859b5ef801a\") " pod="openshift-marketplace/redhat-marketplace-4lwqz" Feb 28 09:27:05 crc kubenswrapper[4687]: I0228 09:27:05.842456 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c8f5f55-e68a-483d-b0a4-e859b5ef801a-catalog-content\") pod \"redhat-marketplace-4lwqz\" (UID: \"2c8f5f55-e68a-483d-b0a4-e859b5ef801a\") " pod="openshift-marketplace/redhat-marketplace-4lwqz" Feb 28 09:27:05 crc kubenswrapper[4687]: I0228 09:27:05.842864 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f48d\" (UniqueName: \"kubernetes.io/projected/2c8f5f55-e68a-483d-b0a4-e859b5ef801a-kube-api-access-7f48d\") pod \"redhat-marketplace-4lwqz\" (UID: \"2c8f5f55-e68a-483d-b0a4-e859b5ef801a\") " pod="openshift-marketplace/redhat-marketplace-4lwqz" Feb 28 09:27:05 crc kubenswrapper[4687]: I0228 09:27:05.945110 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c8f5f55-e68a-483d-b0a4-e859b5ef801a-utilities\") pod \"redhat-marketplace-4lwqz\" (UID: \"2c8f5f55-e68a-483d-b0a4-e859b5ef801a\") " pod="openshift-marketplace/redhat-marketplace-4lwqz" Feb 28 09:27:05 crc kubenswrapper[4687]: I0228 09:27:05.945156 4687 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c8f5f55-e68a-483d-b0a4-e859b5ef801a-catalog-content\") pod \"redhat-marketplace-4lwqz\" (UID: \"2c8f5f55-e68a-483d-b0a4-e859b5ef801a\") " pod="openshift-marketplace/redhat-marketplace-4lwqz" Feb 28 09:27:05 crc kubenswrapper[4687]: I0228 09:27:05.945292 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f48d\" (UniqueName: \"kubernetes.io/projected/2c8f5f55-e68a-483d-b0a4-e859b5ef801a-kube-api-access-7f48d\") pod \"redhat-marketplace-4lwqz\" (UID: \"2c8f5f55-e68a-483d-b0a4-e859b5ef801a\") " pod="openshift-marketplace/redhat-marketplace-4lwqz" Feb 28 09:27:05 crc kubenswrapper[4687]: I0228 09:27:05.945620 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c8f5f55-e68a-483d-b0a4-e859b5ef801a-utilities\") pod \"redhat-marketplace-4lwqz\" (UID: \"2c8f5f55-e68a-483d-b0a4-e859b5ef801a\") " pod="openshift-marketplace/redhat-marketplace-4lwqz" Feb 28 09:27:05 crc kubenswrapper[4687]: I0228 09:27:05.945799 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c8f5f55-e68a-483d-b0a4-e859b5ef801a-catalog-content\") pod \"redhat-marketplace-4lwqz\" (UID: \"2c8f5f55-e68a-483d-b0a4-e859b5ef801a\") " pod="openshift-marketplace/redhat-marketplace-4lwqz" Feb 28 09:27:05 crc kubenswrapper[4687]: I0228 09:27:05.961379 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f48d\" (UniqueName: \"kubernetes.io/projected/2c8f5f55-e68a-483d-b0a4-e859b5ef801a-kube-api-access-7f48d\") pod \"redhat-marketplace-4lwqz\" (UID: \"2c8f5f55-e68a-483d-b0a4-e859b5ef801a\") " pod="openshift-marketplace/redhat-marketplace-4lwqz" Feb 28 09:27:06 crc kubenswrapper[4687]: I0228 09:27:06.125619 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4lwqz" Feb 28 09:27:06 crc kubenswrapper[4687]: I0228 09:27:06.527454 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4lwqz"] Feb 28 09:27:06 crc kubenswrapper[4687]: I0228 09:27:06.856940 4687 generic.go:334] "Generic (PLEG): container finished" podID="2c8f5f55-e68a-483d-b0a4-e859b5ef801a" containerID="b21a7970299c2d3f8cff2db0b010a2fd948b95ee51c1cb04db1ac68e4a8323da" exitCode=0 Feb 28 09:27:06 crc kubenswrapper[4687]: I0228 09:27:06.856990 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4lwqz" event={"ID":"2c8f5f55-e68a-483d-b0a4-e859b5ef801a","Type":"ContainerDied","Data":"b21a7970299c2d3f8cff2db0b010a2fd948b95ee51c1cb04db1ac68e4a8323da"} Feb 28 09:27:06 crc kubenswrapper[4687]: I0228 09:27:06.857183 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4lwqz" event={"ID":"2c8f5f55-e68a-483d-b0a4-e859b5ef801a","Type":"ContainerStarted","Data":"0f503c463abfb24fb86186f9ff4dff7136631c1b261fee644f3b35d92abf47f7"} Feb 28 09:27:07 crc kubenswrapper[4687]: I0228 09:27:07.867745 4687 generic.go:334] "Generic (PLEG): container finished" podID="2c8f5f55-e68a-483d-b0a4-e859b5ef801a" containerID="e72ad6649effb6765575da150325e707cdd51af8fa170bd67696a5df551c73ac" exitCode=0 Feb 28 09:27:07 crc kubenswrapper[4687]: I0228 09:27:07.867934 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4lwqz" event={"ID":"2c8f5f55-e68a-483d-b0a4-e859b5ef801a","Type":"ContainerDied","Data":"e72ad6649effb6765575da150325e707cdd51af8fa170bd67696a5df551c73ac"} Feb 28 09:27:08 crc kubenswrapper[4687]: I0228 09:27:08.880012 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4lwqz" 
event={"ID":"2c8f5f55-e68a-483d-b0a4-e859b5ef801a","Type":"ContainerStarted","Data":"3d408cce6537866eefddc02bfd405b3505c1b14de1015206d65db9e9641c0c07"} Feb 28 09:27:08 crc kubenswrapper[4687]: I0228 09:27:08.899273 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4lwqz" podStartSLOduration=2.416978359 podStartE2EDuration="3.899257036s" podCreationTimestamp="2026-02-28 09:27:05 +0000 UTC" firstStartedPulling="2026-02-28 09:27:06.858257968 +0000 UTC m=+1418.548827305" lastFinishedPulling="2026-02-28 09:27:08.340536645 +0000 UTC m=+1420.031105982" observedRunningTime="2026-02-28 09:27:08.89347289 +0000 UTC m=+1420.584042228" watchObservedRunningTime="2026-02-28 09:27:08.899257036 +0000 UTC m=+1420.589826374" Feb 28 09:27:16 crc kubenswrapper[4687]: I0228 09:27:16.126492 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4lwqz" Feb 28 09:27:16 crc kubenswrapper[4687]: I0228 09:27:16.126883 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4lwqz" Feb 28 09:27:16 crc kubenswrapper[4687]: I0228 09:27:16.161046 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4lwqz" Feb 28 09:27:16 crc kubenswrapper[4687]: I0228 09:27:16.987374 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4lwqz" Feb 28 09:27:17 crc kubenswrapper[4687]: I0228 09:27:17.025563 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4lwqz"] Feb 28 09:27:18 crc kubenswrapper[4687]: I0228 09:27:18.964874 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4lwqz" podUID="2c8f5f55-e68a-483d-b0a4-e859b5ef801a" containerName="registry-server" 
containerID="cri-o://3d408cce6537866eefddc02bfd405b3505c1b14de1015206d65db9e9641c0c07" gracePeriod=2 Feb 28 09:27:19 crc kubenswrapper[4687]: I0228 09:27:19.351514 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4lwqz" Feb 28 09:27:19 crc kubenswrapper[4687]: I0228 09:27:19.520384 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f48d\" (UniqueName: \"kubernetes.io/projected/2c8f5f55-e68a-483d-b0a4-e859b5ef801a-kube-api-access-7f48d\") pod \"2c8f5f55-e68a-483d-b0a4-e859b5ef801a\" (UID: \"2c8f5f55-e68a-483d-b0a4-e859b5ef801a\") " Feb 28 09:27:19 crc kubenswrapper[4687]: I0228 09:27:19.520498 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c8f5f55-e68a-483d-b0a4-e859b5ef801a-catalog-content\") pod \"2c8f5f55-e68a-483d-b0a4-e859b5ef801a\" (UID: \"2c8f5f55-e68a-483d-b0a4-e859b5ef801a\") " Feb 28 09:27:19 crc kubenswrapper[4687]: I0228 09:27:19.520548 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c8f5f55-e68a-483d-b0a4-e859b5ef801a-utilities\") pod \"2c8f5f55-e68a-483d-b0a4-e859b5ef801a\" (UID: \"2c8f5f55-e68a-483d-b0a4-e859b5ef801a\") " Feb 28 09:27:19 crc kubenswrapper[4687]: I0228 09:27:19.521282 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c8f5f55-e68a-483d-b0a4-e859b5ef801a-utilities" (OuterVolumeSpecName: "utilities") pod "2c8f5f55-e68a-483d-b0a4-e859b5ef801a" (UID: "2c8f5f55-e68a-483d-b0a4-e859b5ef801a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:27:19 crc kubenswrapper[4687]: I0228 09:27:19.526253 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c8f5f55-e68a-483d-b0a4-e859b5ef801a-kube-api-access-7f48d" (OuterVolumeSpecName: "kube-api-access-7f48d") pod "2c8f5f55-e68a-483d-b0a4-e859b5ef801a" (UID: "2c8f5f55-e68a-483d-b0a4-e859b5ef801a"). InnerVolumeSpecName "kube-api-access-7f48d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:27:19 crc kubenswrapper[4687]: I0228 09:27:19.542447 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c8f5f55-e68a-483d-b0a4-e859b5ef801a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c8f5f55-e68a-483d-b0a4-e859b5ef801a" (UID: "2c8f5f55-e68a-483d-b0a4-e859b5ef801a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:27:19 crc kubenswrapper[4687]: I0228 09:27:19.622703 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f48d\" (UniqueName: \"kubernetes.io/projected/2c8f5f55-e68a-483d-b0a4-e859b5ef801a-kube-api-access-7f48d\") on node \"crc\" DevicePath \"\"" Feb 28 09:27:19 crc kubenswrapper[4687]: I0228 09:27:19.622734 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c8f5f55-e68a-483d-b0a4-e859b5ef801a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:27:19 crc kubenswrapper[4687]: I0228 09:27:19.622746 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c8f5f55-e68a-483d-b0a4-e859b5ef801a-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:27:19 crc kubenswrapper[4687]: I0228 09:27:19.975549 4687 generic.go:334] "Generic (PLEG): container finished" podID="2c8f5f55-e68a-483d-b0a4-e859b5ef801a" 
containerID="3d408cce6537866eefddc02bfd405b3505c1b14de1015206d65db9e9641c0c07" exitCode=0 Feb 28 09:27:19 crc kubenswrapper[4687]: I0228 09:27:19.975595 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4lwqz" event={"ID":"2c8f5f55-e68a-483d-b0a4-e859b5ef801a","Type":"ContainerDied","Data":"3d408cce6537866eefddc02bfd405b3505c1b14de1015206d65db9e9641c0c07"} Feb 28 09:27:19 crc kubenswrapper[4687]: I0228 09:27:19.975615 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4lwqz" Feb 28 09:27:19 crc kubenswrapper[4687]: I0228 09:27:19.975627 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4lwqz" event={"ID":"2c8f5f55-e68a-483d-b0a4-e859b5ef801a","Type":"ContainerDied","Data":"0f503c463abfb24fb86186f9ff4dff7136631c1b261fee644f3b35d92abf47f7"} Feb 28 09:27:19 crc kubenswrapper[4687]: I0228 09:27:19.975648 4687 scope.go:117] "RemoveContainer" containerID="3d408cce6537866eefddc02bfd405b3505c1b14de1015206d65db9e9641c0c07" Feb 28 09:27:19 crc kubenswrapper[4687]: I0228 09:27:19.994359 4687 scope.go:117] "RemoveContainer" containerID="e72ad6649effb6765575da150325e707cdd51af8fa170bd67696a5df551c73ac" Feb 28 09:27:20 crc kubenswrapper[4687]: I0228 09:27:20.002476 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4lwqz"] Feb 28 09:27:20 crc kubenswrapper[4687]: I0228 09:27:20.010912 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4lwqz"] Feb 28 09:27:20 crc kubenswrapper[4687]: I0228 09:27:20.014525 4687 scope.go:117] "RemoveContainer" containerID="b21a7970299c2d3f8cff2db0b010a2fd948b95ee51c1cb04db1ac68e4a8323da" Feb 28 09:27:20 crc kubenswrapper[4687]: I0228 09:27:20.048694 4687 scope.go:117] "RemoveContainer" containerID="3d408cce6537866eefddc02bfd405b3505c1b14de1015206d65db9e9641c0c07" Feb 28 
09:27:20 crc kubenswrapper[4687]: E0228 09:27:20.048975 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d408cce6537866eefddc02bfd405b3505c1b14de1015206d65db9e9641c0c07\": container with ID starting with 3d408cce6537866eefddc02bfd405b3505c1b14de1015206d65db9e9641c0c07 not found: ID does not exist" containerID="3d408cce6537866eefddc02bfd405b3505c1b14de1015206d65db9e9641c0c07" Feb 28 09:27:20 crc kubenswrapper[4687]: I0228 09:27:20.049009 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d408cce6537866eefddc02bfd405b3505c1b14de1015206d65db9e9641c0c07"} err="failed to get container status \"3d408cce6537866eefddc02bfd405b3505c1b14de1015206d65db9e9641c0c07\": rpc error: code = NotFound desc = could not find container \"3d408cce6537866eefddc02bfd405b3505c1b14de1015206d65db9e9641c0c07\": container with ID starting with 3d408cce6537866eefddc02bfd405b3505c1b14de1015206d65db9e9641c0c07 not found: ID does not exist" Feb 28 09:27:20 crc kubenswrapper[4687]: I0228 09:27:20.049050 4687 scope.go:117] "RemoveContainer" containerID="e72ad6649effb6765575da150325e707cdd51af8fa170bd67696a5df551c73ac" Feb 28 09:27:20 crc kubenswrapper[4687]: E0228 09:27:20.049460 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e72ad6649effb6765575da150325e707cdd51af8fa170bd67696a5df551c73ac\": container with ID starting with e72ad6649effb6765575da150325e707cdd51af8fa170bd67696a5df551c73ac not found: ID does not exist" containerID="e72ad6649effb6765575da150325e707cdd51af8fa170bd67696a5df551c73ac" Feb 28 09:27:20 crc kubenswrapper[4687]: I0228 09:27:20.049483 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e72ad6649effb6765575da150325e707cdd51af8fa170bd67696a5df551c73ac"} err="failed to get container status 
\"e72ad6649effb6765575da150325e707cdd51af8fa170bd67696a5df551c73ac\": rpc error: code = NotFound desc = could not find container \"e72ad6649effb6765575da150325e707cdd51af8fa170bd67696a5df551c73ac\": container with ID starting with e72ad6649effb6765575da150325e707cdd51af8fa170bd67696a5df551c73ac not found: ID does not exist" Feb 28 09:27:20 crc kubenswrapper[4687]: I0228 09:27:20.049501 4687 scope.go:117] "RemoveContainer" containerID="b21a7970299c2d3f8cff2db0b010a2fd948b95ee51c1cb04db1ac68e4a8323da" Feb 28 09:27:20 crc kubenswrapper[4687]: E0228 09:27:20.049737 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b21a7970299c2d3f8cff2db0b010a2fd948b95ee51c1cb04db1ac68e4a8323da\": container with ID starting with b21a7970299c2d3f8cff2db0b010a2fd948b95ee51c1cb04db1ac68e4a8323da not found: ID does not exist" containerID="b21a7970299c2d3f8cff2db0b010a2fd948b95ee51c1cb04db1ac68e4a8323da" Feb 28 09:27:20 crc kubenswrapper[4687]: I0228 09:27:20.049758 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b21a7970299c2d3f8cff2db0b010a2fd948b95ee51c1cb04db1ac68e4a8323da"} err="failed to get container status \"b21a7970299c2d3f8cff2db0b010a2fd948b95ee51c1cb04db1ac68e4a8323da\": rpc error: code = NotFound desc = could not find container \"b21a7970299c2d3f8cff2db0b010a2fd948b95ee51c1cb04db1ac68e4a8323da\": container with ID starting with b21a7970299c2d3f8cff2db0b010a2fd948b95ee51c1cb04db1ac68e4a8323da not found: ID does not exist" Feb 28 09:27:20 crc kubenswrapper[4687]: I0228 09:27:20.667473 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c8f5f55-e68a-483d-b0a4-e859b5ef801a" path="/var/lib/kubelet/pods/2c8f5f55-e68a-483d-b0a4-e859b5ef801a/volumes" Feb 28 09:27:35 crc kubenswrapper[4687]: I0228 09:27:35.497924 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ms5kd"] Feb 28 09:27:35 
crc kubenswrapper[4687]: E0228 09:27:35.499876 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8f5f55-e68a-483d-b0a4-e859b5ef801a" containerName="registry-server" Feb 28 09:27:35 crc kubenswrapper[4687]: I0228 09:27:35.499950 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c8f5f55-e68a-483d-b0a4-e859b5ef801a" containerName="registry-server" Feb 28 09:27:35 crc kubenswrapper[4687]: E0228 09:27:35.500088 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8f5f55-e68a-483d-b0a4-e859b5ef801a" containerName="extract-content" Feb 28 09:27:35 crc kubenswrapper[4687]: I0228 09:27:35.500163 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c8f5f55-e68a-483d-b0a4-e859b5ef801a" containerName="extract-content" Feb 28 09:27:35 crc kubenswrapper[4687]: E0228 09:27:35.500232 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8f5f55-e68a-483d-b0a4-e859b5ef801a" containerName="extract-utilities" Feb 28 09:27:35 crc kubenswrapper[4687]: I0228 09:27:35.500283 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c8f5f55-e68a-483d-b0a4-e859b5ef801a" containerName="extract-utilities" Feb 28 09:27:35 crc kubenswrapper[4687]: I0228 09:27:35.500552 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c8f5f55-e68a-483d-b0a4-e859b5ef801a" containerName="registry-server" Feb 28 09:27:35 crc kubenswrapper[4687]: I0228 09:27:35.502047 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ms5kd" Feb 28 09:27:35 crc kubenswrapper[4687]: I0228 09:27:35.515994 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ms5kd"] Feb 28 09:27:35 crc kubenswrapper[4687]: I0228 09:27:35.531728 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6-utilities\") pod \"community-operators-ms5kd\" (UID: \"96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6\") " pod="openshift-marketplace/community-operators-ms5kd" Feb 28 09:27:35 crc kubenswrapper[4687]: I0228 09:27:35.531972 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6-catalog-content\") pod \"community-operators-ms5kd\" (UID: \"96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6\") " pod="openshift-marketplace/community-operators-ms5kd" Feb 28 09:27:35 crc kubenswrapper[4687]: I0228 09:27:35.532149 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbpdn\" (UniqueName: \"kubernetes.io/projected/96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6-kube-api-access-sbpdn\") pod \"community-operators-ms5kd\" (UID: \"96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6\") " pod="openshift-marketplace/community-operators-ms5kd" Feb 28 09:27:35 crc kubenswrapper[4687]: I0228 09:27:35.634321 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6-catalog-content\") pod \"community-operators-ms5kd\" (UID: \"96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6\") " pod="openshift-marketplace/community-operators-ms5kd" Feb 28 09:27:35 crc kubenswrapper[4687]: I0228 09:27:35.634406 4687 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sbpdn\" (UniqueName: \"kubernetes.io/projected/96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6-kube-api-access-sbpdn\") pod \"community-operators-ms5kd\" (UID: \"96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6\") " pod="openshift-marketplace/community-operators-ms5kd" Feb 28 09:27:35 crc kubenswrapper[4687]: I0228 09:27:35.634438 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6-utilities\") pod \"community-operators-ms5kd\" (UID: \"96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6\") " pod="openshift-marketplace/community-operators-ms5kd" Feb 28 09:27:35 crc kubenswrapper[4687]: I0228 09:27:35.634800 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6-catalog-content\") pod \"community-operators-ms5kd\" (UID: \"96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6\") " pod="openshift-marketplace/community-operators-ms5kd" Feb 28 09:27:35 crc kubenswrapper[4687]: I0228 09:27:35.634906 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6-utilities\") pod \"community-operators-ms5kd\" (UID: \"96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6\") " pod="openshift-marketplace/community-operators-ms5kd" Feb 28 09:27:35 crc kubenswrapper[4687]: I0228 09:27:35.651377 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbpdn\" (UniqueName: \"kubernetes.io/projected/96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6-kube-api-access-sbpdn\") pod \"community-operators-ms5kd\" (UID: \"96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6\") " pod="openshift-marketplace/community-operators-ms5kd" Feb 28 09:27:35 crc kubenswrapper[4687]: I0228 09:27:35.819539 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ms5kd" Feb 28 09:27:36 crc kubenswrapper[4687]: I0228 09:27:36.118903 4687 generic.go:334] "Generic (PLEG): container finished" podID="e607377f-9f4c-4f40-8d5c-17487eb054b8" containerID="39d1f91383baf2a1f96e78d052edf975fce634d275a2d27c013199a926cf47b8" exitCode=0 Feb 28 09:27:36 crc kubenswrapper[4687]: I0228 09:27:36.118946 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls" event={"ID":"e607377f-9f4c-4f40-8d5c-17487eb054b8","Type":"ContainerDied","Data":"39d1f91383baf2a1f96e78d052edf975fce634d275a2d27c013199a926cf47b8"} Feb 28 09:27:36 crc kubenswrapper[4687]: I0228 09:27:36.265852 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ms5kd"] Feb 28 09:27:37 crc kubenswrapper[4687]: I0228 09:27:37.130246 4687 generic.go:334] "Generic (PLEG): container finished" podID="96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6" containerID="1f36c332d8b11dde52f9ef4b34e3a836f2f40502419eab109db1c53115266dd8" exitCode=0 Feb 28 09:27:37 crc kubenswrapper[4687]: I0228 09:27:37.130395 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ms5kd" event={"ID":"96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6","Type":"ContainerDied","Data":"1f36c332d8b11dde52f9ef4b34e3a836f2f40502419eab109db1c53115266dd8"} Feb 28 09:27:37 crc kubenswrapper[4687]: I0228 09:27:37.131251 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ms5kd" event={"ID":"96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6","Type":"ContainerStarted","Data":"aa5087d8ddfad4dc469882915baf9460c6e846df2c7aeb444ffb657a7f3cfc1d"} Feb 28 09:27:37 crc kubenswrapper[4687]: I0228 09:27:37.132103 4687 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 28 09:27:37 crc kubenswrapper[4687]: I0228 09:27:37.436184 4687 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls" Feb 28 09:27:37 crc kubenswrapper[4687]: I0228 09:27:37.471521 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e607377f-9f4c-4f40-8d5c-17487eb054b8-bootstrap-combined-ca-bundle\") pod \"e607377f-9f4c-4f40-8d5c-17487eb054b8\" (UID: \"e607377f-9f4c-4f40-8d5c-17487eb054b8\") " Feb 28 09:27:37 crc kubenswrapper[4687]: I0228 09:27:37.471700 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e607377f-9f4c-4f40-8d5c-17487eb054b8-inventory\") pod \"e607377f-9f4c-4f40-8d5c-17487eb054b8\" (UID: \"e607377f-9f4c-4f40-8d5c-17487eb054b8\") " Feb 28 09:27:37 crc kubenswrapper[4687]: I0228 09:27:37.471768 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e607377f-9f4c-4f40-8d5c-17487eb054b8-ssh-key-openstack-edpm-ipam\") pod \"e607377f-9f4c-4f40-8d5c-17487eb054b8\" (UID: \"e607377f-9f4c-4f40-8d5c-17487eb054b8\") " Feb 28 09:27:37 crc kubenswrapper[4687]: I0228 09:27:37.471892 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7dln\" (UniqueName: \"kubernetes.io/projected/e607377f-9f4c-4f40-8d5c-17487eb054b8-kube-api-access-p7dln\") pod \"e607377f-9f4c-4f40-8d5c-17487eb054b8\" (UID: \"e607377f-9f4c-4f40-8d5c-17487eb054b8\") " Feb 28 09:27:37 crc kubenswrapper[4687]: I0228 09:27:37.477898 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e607377f-9f4c-4f40-8d5c-17487eb054b8-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "e607377f-9f4c-4f40-8d5c-17487eb054b8" (UID: "e607377f-9f4c-4f40-8d5c-17487eb054b8"). 
InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:27:37 crc kubenswrapper[4687]: I0228 09:27:37.477915 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e607377f-9f4c-4f40-8d5c-17487eb054b8-kube-api-access-p7dln" (OuterVolumeSpecName: "kube-api-access-p7dln") pod "e607377f-9f4c-4f40-8d5c-17487eb054b8" (UID: "e607377f-9f4c-4f40-8d5c-17487eb054b8"). InnerVolumeSpecName "kube-api-access-p7dln". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:27:37 crc kubenswrapper[4687]: I0228 09:27:37.495823 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e607377f-9f4c-4f40-8d5c-17487eb054b8-inventory" (OuterVolumeSpecName: "inventory") pod "e607377f-9f4c-4f40-8d5c-17487eb054b8" (UID: "e607377f-9f4c-4f40-8d5c-17487eb054b8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:27:37 crc kubenswrapper[4687]: I0228 09:27:37.496926 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e607377f-9f4c-4f40-8d5c-17487eb054b8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e607377f-9f4c-4f40-8d5c-17487eb054b8" (UID: "e607377f-9f4c-4f40-8d5c-17487eb054b8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:27:37 crc kubenswrapper[4687]: I0228 09:27:37.573486 4687 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e607377f-9f4c-4f40-8d5c-17487eb054b8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 28 09:27:37 crc kubenswrapper[4687]: I0228 09:27:37.573515 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7dln\" (UniqueName: \"kubernetes.io/projected/e607377f-9f4c-4f40-8d5c-17487eb054b8-kube-api-access-p7dln\") on node \"crc\" DevicePath \"\"" Feb 28 09:27:37 crc kubenswrapper[4687]: I0228 09:27:37.573525 4687 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e607377f-9f4c-4f40-8d5c-17487eb054b8-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:27:37 crc kubenswrapper[4687]: I0228 09:27:37.573542 4687 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e607377f-9f4c-4f40-8d5c-17487eb054b8-inventory\") on node \"crc\" DevicePath \"\"" Feb 28 09:27:38 crc kubenswrapper[4687]: I0228 09:27:38.143077 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ms5kd" event={"ID":"96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6","Type":"ContainerStarted","Data":"ee4ea63dd295a1a67379a68dd25ee77474ce12157792471840e4923ea1a9aa1f"} Feb 28 09:27:38 crc kubenswrapper[4687]: I0228 09:27:38.148753 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls" event={"ID":"e607377f-9f4c-4f40-8d5c-17487eb054b8","Type":"ContainerDied","Data":"58816113590aea9b44456eaf4228a0791819e48a9db0989c738030537185249e"} Feb 28 09:27:38 crc kubenswrapper[4687]: I0228 09:27:38.148816 4687 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="58816113590aea9b44456eaf4228a0791819e48a9db0989c738030537185249e" Feb 28 09:27:38 crc kubenswrapper[4687]: I0228 09:27:38.148916 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls" Feb 28 09:27:38 crc kubenswrapper[4687]: I0228 09:27:38.218817 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt"] Feb 28 09:27:38 crc kubenswrapper[4687]: E0228 09:27:38.219468 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e607377f-9f4c-4f40-8d5c-17487eb054b8" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 28 09:27:38 crc kubenswrapper[4687]: I0228 09:27:38.219488 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="e607377f-9f4c-4f40-8d5c-17487eb054b8" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 28 09:27:38 crc kubenswrapper[4687]: I0228 09:27:38.219697 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="e607377f-9f4c-4f40-8d5c-17487eb054b8" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 28 09:27:38 crc kubenswrapper[4687]: I0228 09:27:38.220310 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt" Feb 28 09:27:38 crc kubenswrapper[4687]: I0228 09:27:38.227913 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 28 09:27:38 crc kubenswrapper[4687]: I0228 09:27:38.228384 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 28 09:27:38 crc kubenswrapper[4687]: I0228 09:27:38.228833 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ffgb4" Feb 28 09:27:38 crc kubenswrapper[4687]: I0228 09:27:38.229149 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 28 09:27:38 crc kubenswrapper[4687]: I0228 09:27:38.256227 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt"] Feb 28 09:27:38 crc kubenswrapper[4687]: I0228 09:27:38.289439 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2162138d-1397-4721-adeb-73e30bf37580-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt\" (UID: \"2162138d-1397-4721-adeb-73e30bf37580\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt" Feb 28 09:27:38 crc kubenswrapper[4687]: I0228 09:27:38.289660 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lrcm\" (UniqueName: \"kubernetes.io/projected/2162138d-1397-4721-adeb-73e30bf37580-kube-api-access-2lrcm\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt\" (UID: \"2162138d-1397-4721-adeb-73e30bf37580\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt" Feb 28 09:27:38 crc kubenswrapper[4687]: I0228 
09:27:38.289710 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2162138d-1397-4721-adeb-73e30bf37580-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt\" (UID: \"2162138d-1397-4721-adeb-73e30bf37580\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt" Feb 28 09:27:38 crc kubenswrapper[4687]: I0228 09:27:38.391357 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lrcm\" (UniqueName: \"kubernetes.io/projected/2162138d-1397-4721-adeb-73e30bf37580-kube-api-access-2lrcm\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt\" (UID: \"2162138d-1397-4721-adeb-73e30bf37580\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt" Feb 28 09:27:38 crc kubenswrapper[4687]: I0228 09:27:38.391512 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2162138d-1397-4721-adeb-73e30bf37580-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt\" (UID: \"2162138d-1397-4721-adeb-73e30bf37580\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt" Feb 28 09:27:38 crc kubenswrapper[4687]: I0228 09:27:38.391879 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2162138d-1397-4721-adeb-73e30bf37580-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt\" (UID: \"2162138d-1397-4721-adeb-73e30bf37580\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt" Feb 28 09:27:38 crc kubenswrapper[4687]: I0228 09:27:38.395524 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/2162138d-1397-4721-adeb-73e30bf37580-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt\" (UID: \"2162138d-1397-4721-adeb-73e30bf37580\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt" Feb 28 09:27:38 crc kubenswrapper[4687]: I0228 09:27:38.395565 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2162138d-1397-4721-adeb-73e30bf37580-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt\" (UID: \"2162138d-1397-4721-adeb-73e30bf37580\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt" Feb 28 09:27:38 crc kubenswrapper[4687]: I0228 09:27:38.405798 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lrcm\" (UniqueName: \"kubernetes.io/projected/2162138d-1397-4721-adeb-73e30bf37580-kube-api-access-2lrcm\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt\" (UID: \"2162138d-1397-4721-adeb-73e30bf37580\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt" Feb 28 09:27:38 crc kubenswrapper[4687]: I0228 09:27:38.552287 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt" Feb 28 09:27:39 crc kubenswrapper[4687]: I0228 09:27:39.029567 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt"] Feb 28 09:27:39 crc kubenswrapper[4687]: I0228 09:27:39.158953 4687 generic.go:334] "Generic (PLEG): container finished" podID="96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6" containerID="ee4ea63dd295a1a67379a68dd25ee77474ce12157792471840e4923ea1a9aa1f" exitCode=0 Feb 28 09:27:39 crc kubenswrapper[4687]: I0228 09:27:39.158987 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ms5kd" event={"ID":"96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6","Type":"ContainerDied","Data":"ee4ea63dd295a1a67379a68dd25ee77474ce12157792471840e4923ea1a9aa1f"} Feb 28 09:27:39 crc kubenswrapper[4687]: I0228 09:27:39.160697 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt" event={"ID":"2162138d-1397-4721-adeb-73e30bf37580","Type":"ContainerStarted","Data":"a75d2897199c85ae54258cf29078dd8dbf669f1a8ed66c357de065744daac2f5"} Feb 28 09:27:40 crc kubenswrapper[4687]: I0228 09:27:40.172295 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt" event={"ID":"2162138d-1397-4721-adeb-73e30bf37580","Type":"ContainerStarted","Data":"60a4d48147a7c756c2d1a0ce2ccb082e489228281f33c8d04555f2cf52e9212f"} Feb 28 09:27:40 crc kubenswrapper[4687]: I0228 09:27:40.176304 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ms5kd" event={"ID":"96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6","Type":"ContainerStarted","Data":"6618a77fd962f8f0636b46101b04c935660f53179b1b10ea37043aa575b3eb0c"} Feb 28 09:27:40 crc kubenswrapper[4687]: I0228 09:27:40.193497 4687 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt" podStartSLOduration=1.631441707 podStartE2EDuration="2.193479719s" podCreationTimestamp="2026-02-28 09:27:38 +0000 UTC" firstStartedPulling="2026-02-28 09:27:39.033281797 +0000 UTC m=+1450.723851124" lastFinishedPulling="2026-02-28 09:27:39.595319798 +0000 UTC m=+1451.285889136" observedRunningTime="2026-02-28 09:27:40.189455111 +0000 UTC m=+1451.880024449" watchObservedRunningTime="2026-02-28 09:27:40.193479719 +0000 UTC m=+1451.884049056" Feb 28 09:27:40 crc kubenswrapper[4687]: I0228 09:27:40.208971 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ms5kd" podStartSLOduration=2.729505988 podStartE2EDuration="5.208944217s" podCreationTimestamp="2026-02-28 09:27:35 +0000 UTC" firstStartedPulling="2026-02-28 09:27:37.131818826 +0000 UTC m=+1448.822388163" lastFinishedPulling="2026-02-28 09:27:39.611257055 +0000 UTC m=+1451.301826392" observedRunningTime="2026-02-28 09:27:40.201870436 +0000 UTC m=+1451.892439772" watchObservedRunningTime="2026-02-28 09:27:40.208944217 +0000 UTC m=+1451.899513553" Feb 28 09:27:45 crc kubenswrapper[4687]: I0228 09:27:45.820384 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ms5kd" Feb 28 09:27:45 crc kubenswrapper[4687]: I0228 09:27:45.820625 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ms5kd" Feb 28 09:27:45 crc kubenswrapper[4687]: I0228 09:27:45.854415 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ms5kd" Feb 28 09:27:46 crc kubenswrapper[4687]: I0228 09:27:46.265592 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ms5kd" Feb 28 09:27:46 crc kubenswrapper[4687]: I0228 09:27:46.310647 4687 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ms5kd"] Feb 28 09:27:48 crc kubenswrapper[4687]: I0228 09:27:48.249975 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ms5kd" podUID="96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6" containerName="registry-server" containerID="cri-o://6618a77fd962f8f0636b46101b04c935660f53179b1b10ea37043aa575b3eb0c" gracePeriod=2 Feb 28 09:27:48 crc kubenswrapper[4687]: I0228 09:27:48.590909 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ms5kd" Feb 28 09:27:48 crc kubenswrapper[4687]: I0228 09:27:48.682753 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6-utilities\") pod \"96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6\" (UID: \"96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6\") " Feb 28 09:27:48 crc kubenswrapper[4687]: I0228 09:27:48.682881 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbpdn\" (UniqueName: \"kubernetes.io/projected/96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6-kube-api-access-sbpdn\") pod \"96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6\" (UID: \"96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6\") " Feb 28 09:27:48 crc kubenswrapper[4687]: I0228 09:27:48.682963 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6-catalog-content\") pod \"96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6\" (UID: \"96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6\") " Feb 28 09:27:48 crc kubenswrapper[4687]: I0228 09:27:48.683528 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6-utilities" (OuterVolumeSpecName: "utilities") pod 
"96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6" (UID: "96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:27:48 crc kubenswrapper[4687]: I0228 09:27:48.687620 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6-kube-api-access-sbpdn" (OuterVolumeSpecName: "kube-api-access-sbpdn") pod "96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6" (UID: "96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6"). InnerVolumeSpecName "kube-api-access-sbpdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:27:48 crc kubenswrapper[4687]: I0228 09:27:48.720842 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6" (UID: "96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:27:48 crc kubenswrapper[4687]: I0228 09:27:48.785441 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbpdn\" (UniqueName: \"kubernetes.io/projected/96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6-kube-api-access-sbpdn\") on node \"crc\" DevicePath \"\"" Feb 28 09:27:48 crc kubenswrapper[4687]: I0228 09:27:48.785475 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:27:48 crc kubenswrapper[4687]: I0228 09:27:48.785489 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:27:49 crc kubenswrapper[4687]: I0228 09:27:49.261440 4687 generic.go:334] "Generic (PLEG): container finished" podID="96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6" containerID="6618a77fd962f8f0636b46101b04c935660f53179b1b10ea37043aa575b3eb0c" exitCode=0 Feb 28 09:27:49 crc kubenswrapper[4687]: I0228 09:27:49.261477 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ms5kd" event={"ID":"96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6","Type":"ContainerDied","Data":"6618a77fd962f8f0636b46101b04c935660f53179b1b10ea37043aa575b3eb0c"} Feb 28 09:27:49 crc kubenswrapper[4687]: I0228 09:27:49.261504 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ms5kd" event={"ID":"96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6","Type":"ContainerDied","Data":"aa5087d8ddfad4dc469882915baf9460c6e846df2c7aeb444ffb657a7f3cfc1d"} Feb 28 09:27:49 crc kubenswrapper[4687]: I0228 09:27:49.261523 4687 scope.go:117] "RemoveContainer" containerID="6618a77fd962f8f0636b46101b04c935660f53179b1b10ea37043aa575b3eb0c" Feb 28 09:27:49 crc kubenswrapper[4687]: I0228 
09:27:49.261529 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ms5kd" Feb 28 09:27:49 crc kubenswrapper[4687]: I0228 09:27:49.279050 4687 scope.go:117] "RemoveContainer" containerID="ee4ea63dd295a1a67379a68dd25ee77474ce12157792471840e4923ea1a9aa1f" Feb 28 09:27:49 crc kubenswrapper[4687]: I0228 09:27:49.289179 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ms5kd"] Feb 28 09:27:49 crc kubenswrapper[4687]: I0228 09:27:49.295449 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ms5kd"] Feb 28 09:27:49 crc kubenswrapper[4687]: I0228 09:27:49.316233 4687 scope.go:117] "RemoveContainer" containerID="1f36c332d8b11dde52f9ef4b34e3a836f2f40502419eab109db1c53115266dd8" Feb 28 09:27:49 crc kubenswrapper[4687]: I0228 09:27:49.335829 4687 scope.go:117] "RemoveContainer" containerID="6618a77fd962f8f0636b46101b04c935660f53179b1b10ea37043aa575b3eb0c" Feb 28 09:27:49 crc kubenswrapper[4687]: E0228 09:27:49.336209 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6618a77fd962f8f0636b46101b04c935660f53179b1b10ea37043aa575b3eb0c\": container with ID starting with 6618a77fd962f8f0636b46101b04c935660f53179b1b10ea37043aa575b3eb0c not found: ID does not exist" containerID="6618a77fd962f8f0636b46101b04c935660f53179b1b10ea37043aa575b3eb0c" Feb 28 09:27:49 crc kubenswrapper[4687]: I0228 09:27:49.336291 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6618a77fd962f8f0636b46101b04c935660f53179b1b10ea37043aa575b3eb0c"} err="failed to get container status \"6618a77fd962f8f0636b46101b04c935660f53179b1b10ea37043aa575b3eb0c\": rpc error: code = NotFound desc = could not find container \"6618a77fd962f8f0636b46101b04c935660f53179b1b10ea37043aa575b3eb0c\": container with ID starting with 
6618a77fd962f8f0636b46101b04c935660f53179b1b10ea37043aa575b3eb0c not found: ID does not exist" Feb 28 09:27:49 crc kubenswrapper[4687]: I0228 09:27:49.336371 4687 scope.go:117] "RemoveContainer" containerID="ee4ea63dd295a1a67379a68dd25ee77474ce12157792471840e4923ea1a9aa1f" Feb 28 09:27:49 crc kubenswrapper[4687]: E0228 09:27:49.336702 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee4ea63dd295a1a67379a68dd25ee77474ce12157792471840e4923ea1a9aa1f\": container with ID starting with ee4ea63dd295a1a67379a68dd25ee77474ce12157792471840e4923ea1a9aa1f not found: ID does not exist" containerID="ee4ea63dd295a1a67379a68dd25ee77474ce12157792471840e4923ea1a9aa1f" Feb 28 09:27:49 crc kubenswrapper[4687]: I0228 09:27:49.336736 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee4ea63dd295a1a67379a68dd25ee77474ce12157792471840e4923ea1a9aa1f"} err="failed to get container status \"ee4ea63dd295a1a67379a68dd25ee77474ce12157792471840e4923ea1a9aa1f\": rpc error: code = NotFound desc = could not find container \"ee4ea63dd295a1a67379a68dd25ee77474ce12157792471840e4923ea1a9aa1f\": container with ID starting with ee4ea63dd295a1a67379a68dd25ee77474ce12157792471840e4923ea1a9aa1f not found: ID does not exist" Feb 28 09:27:49 crc kubenswrapper[4687]: I0228 09:27:49.336765 4687 scope.go:117] "RemoveContainer" containerID="1f36c332d8b11dde52f9ef4b34e3a836f2f40502419eab109db1c53115266dd8" Feb 28 09:27:49 crc kubenswrapper[4687]: E0228 09:27:49.336959 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f36c332d8b11dde52f9ef4b34e3a836f2f40502419eab109db1c53115266dd8\": container with ID starting with 1f36c332d8b11dde52f9ef4b34e3a836f2f40502419eab109db1c53115266dd8 not found: ID does not exist" containerID="1f36c332d8b11dde52f9ef4b34e3a836f2f40502419eab109db1c53115266dd8" Feb 28 09:27:49 crc 
kubenswrapper[4687]: I0228 09:27:49.336980 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f36c332d8b11dde52f9ef4b34e3a836f2f40502419eab109db1c53115266dd8"} err="failed to get container status \"1f36c332d8b11dde52f9ef4b34e3a836f2f40502419eab109db1c53115266dd8\": rpc error: code = NotFound desc = could not find container \"1f36c332d8b11dde52f9ef4b34e3a836f2f40502419eab109db1c53115266dd8\": container with ID starting with 1f36c332d8b11dde52f9ef4b34e3a836f2f40502419eab109db1c53115266dd8 not found: ID does not exist" Feb 28 09:27:50 crc kubenswrapper[4687]: I0228 09:27:50.666290 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6" path="/var/lib/kubelet/pods/96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6/volumes" Feb 28 09:27:55 crc kubenswrapper[4687]: I0228 09:27:55.002080 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:27:55 crc kubenswrapper[4687]: I0228 09:27:55.002474 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:28:00 crc kubenswrapper[4687]: I0228 09:28:00.136852 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537848-g7jxk"] Feb 28 09:28:00 crc kubenswrapper[4687]: E0228 09:28:00.137654 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6" containerName="registry-server" Feb 28 09:28:00 crc kubenswrapper[4687]: I0228 
09:28:00.137668 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6" containerName="registry-server" Feb 28 09:28:00 crc kubenswrapper[4687]: E0228 09:28:00.137682 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6" containerName="extract-utilities" Feb 28 09:28:00 crc kubenswrapper[4687]: I0228 09:28:00.137688 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6" containerName="extract-utilities" Feb 28 09:28:00 crc kubenswrapper[4687]: E0228 09:28:00.137701 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6" containerName="extract-content" Feb 28 09:28:00 crc kubenswrapper[4687]: I0228 09:28:00.137707 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6" containerName="extract-content" Feb 28 09:28:00 crc kubenswrapper[4687]: I0228 09:28:00.137924 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="96a4ba6e-c2ef-44de-8d35-d79dad4a4ea6" containerName="registry-server" Feb 28 09:28:00 crc kubenswrapper[4687]: I0228 09:28:00.138594 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537848-g7jxk" Feb 28 09:28:00 crc kubenswrapper[4687]: I0228 09:28:00.140473 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:28:00 crc kubenswrapper[4687]: I0228 09:28:00.140602 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:28:00 crc kubenswrapper[4687]: I0228 09:28:00.141826 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:28:00 crc kubenswrapper[4687]: I0228 09:28:00.151577 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537848-g7jxk"] Feb 28 09:28:00 crc kubenswrapper[4687]: I0228 09:28:00.219167 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p78vb\" (UniqueName: \"kubernetes.io/projected/d741a584-384a-4d5a-bf8e-07e2603f0af0-kube-api-access-p78vb\") pod \"auto-csr-approver-29537848-g7jxk\" (UID: \"d741a584-384a-4d5a-bf8e-07e2603f0af0\") " pod="openshift-infra/auto-csr-approver-29537848-g7jxk" Feb 28 09:28:00 crc kubenswrapper[4687]: I0228 09:28:00.321469 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p78vb\" (UniqueName: \"kubernetes.io/projected/d741a584-384a-4d5a-bf8e-07e2603f0af0-kube-api-access-p78vb\") pod \"auto-csr-approver-29537848-g7jxk\" (UID: \"d741a584-384a-4d5a-bf8e-07e2603f0af0\") " pod="openshift-infra/auto-csr-approver-29537848-g7jxk" Feb 28 09:28:00 crc kubenswrapper[4687]: I0228 09:28:00.338988 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p78vb\" (UniqueName: \"kubernetes.io/projected/d741a584-384a-4d5a-bf8e-07e2603f0af0-kube-api-access-p78vb\") pod \"auto-csr-approver-29537848-g7jxk\" (UID: \"d741a584-384a-4d5a-bf8e-07e2603f0af0\") " 
pod="openshift-infra/auto-csr-approver-29537848-g7jxk" Feb 28 09:28:00 crc kubenswrapper[4687]: I0228 09:28:00.455565 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537848-g7jxk" Feb 28 09:28:00 crc kubenswrapper[4687]: I0228 09:28:00.872716 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537848-g7jxk"] Feb 28 09:28:01 crc kubenswrapper[4687]: I0228 09:28:01.372055 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537848-g7jxk" event={"ID":"d741a584-384a-4d5a-bf8e-07e2603f0af0","Type":"ContainerStarted","Data":"a19f3ee3d900f695c9d30f92de6f5da363ae69cd46969e74fbe25d300e4c14b6"} Feb 28 09:28:02 crc kubenswrapper[4687]: I0228 09:28:02.383880 4687 generic.go:334] "Generic (PLEG): container finished" podID="d741a584-384a-4d5a-bf8e-07e2603f0af0" containerID="dc90b95f5294f7a0c35cef8d5d8a70312b210c3e425f3907d089df85c9dbee95" exitCode=0 Feb 28 09:28:02 crc kubenswrapper[4687]: I0228 09:28:02.383961 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537848-g7jxk" event={"ID":"d741a584-384a-4d5a-bf8e-07e2603f0af0","Type":"ContainerDied","Data":"dc90b95f5294f7a0c35cef8d5d8a70312b210c3e425f3907d089df85c9dbee95"} Feb 28 09:28:03 crc kubenswrapper[4687]: I0228 09:28:03.680988 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537848-g7jxk" Feb 28 09:28:03 crc kubenswrapper[4687]: I0228 09:28:03.812752 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p78vb\" (UniqueName: \"kubernetes.io/projected/d741a584-384a-4d5a-bf8e-07e2603f0af0-kube-api-access-p78vb\") pod \"d741a584-384a-4d5a-bf8e-07e2603f0af0\" (UID: \"d741a584-384a-4d5a-bf8e-07e2603f0af0\") " Feb 28 09:28:03 crc kubenswrapper[4687]: I0228 09:28:03.819931 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d741a584-384a-4d5a-bf8e-07e2603f0af0-kube-api-access-p78vb" (OuterVolumeSpecName: "kube-api-access-p78vb") pod "d741a584-384a-4d5a-bf8e-07e2603f0af0" (UID: "d741a584-384a-4d5a-bf8e-07e2603f0af0"). InnerVolumeSpecName "kube-api-access-p78vb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:28:03 crc kubenswrapper[4687]: I0228 09:28:03.915129 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p78vb\" (UniqueName: \"kubernetes.io/projected/d741a584-384a-4d5a-bf8e-07e2603f0af0-kube-api-access-p78vb\") on node \"crc\" DevicePath \"\"" Feb 28 09:28:04 crc kubenswrapper[4687]: I0228 09:28:04.407418 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537848-g7jxk" event={"ID":"d741a584-384a-4d5a-bf8e-07e2603f0af0","Type":"ContainerDied","Data":"a19f3ee3d900f695c9d30f92de6f5da363ae69cd46969e74fbe25d300e4c14b6"} Feb 28 09:28:04 crc kubenswrapper[4687]: I0228 09:28:04.407487 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a19f3ee3d900f695c9d30f92de6f5da363ae69cd46969e74fbe25d300e4c14b6" Feb 28 09:28:04 crc kubenswrapper[4687]: I0228 09:28:04.407578 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537848-g7jxk" Feb 28 09:28:04 crc kubenswrapper[4687]: I0228 09:28:04.737061 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537842-cgjsw"] Feb 28 09:28:04 crc kubenswrapper[4687]: I0228 09:28:04.743524 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537842-cgjsw"] Feb 28 09:28:06 crc kubenswrapper[4687]: I0228 09:28:06.667046 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70abdfed-0686-450a-b900-2eda9b68cec7" path="/var/lib/kubelet/pods/70abdfed-0686-450a-b900-2eda9b68cec7/volumes" Feb 28 09:28:25 crc kubenswrapper[4687]: I0228 09:28:25.002365 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:28:25 crc kubenswrapper[4687]: I0228 09:28:25.002903 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:28:41 crc kubenswrapper[4687]: I0228 09:28:41.658537 4687 scope.go:117] "RemoveContainer" containerID="37a55319d850b5712035136020c7276544e69623d55169e9d4ad009f9f951568" Feb 28 09:28:55 crc kubenswrapper[4687]: I0228 09:28:55.002778 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:28:55 crc kubenswrapper[4687]: 
I0228 09:28:55.003512 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:28:55 crc kubenswrapper[4687]: I0228 09:28:55.003572 4687 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" Feb 28 09:28:55 crc kubenswrapper[4687]: I0228 09:28:55.004552 4687 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f"} pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 28 09:28:55 crc kubenswrapper[4687]: I0228 09:28:55.004616 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" containerID="cri-o://3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" gracePeriod=600 Feb 28 09:28:55 crc kubenswrapper[4687]: E0228 09:28:55.125369 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:28:55 crc kubenswrapper[4687]: I0228 09:28:55.888411 4687 generic.go:334] "Generic (PLEG): container finished" 
podID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerID="3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" exitCode=0 Feb 28 09:28:55 crc kubenswrapper[4687]: I0228 09:28:55.888509 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerDied","Data":"3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f"} Feb 28 09:28:55 crc kubenswrapper[4687]: I0228 09:28:55.888734 4687 scope.go:117] "RemoveContainer" containerID="26defbc0a15ba55a0f8e3a7678fa01c73c7ea2162c34ef63cf8b44425106ed7e" Feb 28 09:28:55 crc kubenswrapper[4687]: I0228 09:28:55.889238 4687 scope.go:117] "RemoveContainer" containerID="3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" Feb 28 09:28:55 crc kubenswrapper[4687]: E0228 09:28:55.889633 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:29:08 crc kubenswrapper[4687]: I0228 09:29:08.037142 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-hggb5"] Feb 28 09:29:08 crc kubenswrapper[4687]: I0228 09:29:08.045193 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-hggb5"] Feb 28 09:29:08 crc kubenswrapper[4687]: I0228 09:29:08.666373 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d6ebc98-5929-43f4-8973-a8036ba6b8ca" path="/var/lib/kubelet/pods/9d6ebc98-5929-43f4-8973-a8036ba6b8ca/volumes" Feb 28 09:29:09 crc kubenswrapper[4687]: I0228 09:29:09.024259 4687 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/placement-db-create-5jbj4"] Feb 28 09:29:09 crc kubenswrapper[4687]: I0228 09:29:09.033226 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-5jbj4"] Feb 28 09:29:09 crc kubenswrapper[4687]: I0228 09:29:09.041902 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-598c-account-create-update-k5s5m"] Feb 28 09:29:09 crc kubenswrapper[4687]: I0228 09:29:09.048794 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-3732-account-create-update-zndhg"] Feb 28 09:29:09 crc kubenswrapper[4687]: I0228 09:29:09.054067 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-598c-account-create-update-k5s5m"] Feb 28 09:29:09 crc kubenswrapper[4687]: I0228 09:29:09.059155 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-3732-account-create-update-zndhg"] Feb 28 09:29:10 crc kubenswrapper[4687]: I0228 09:29:10.666713 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e84feee-0007-4202-a1b7-cf6a25ea3261" path="/var/lib/kubelet/pods/3e84feee-0007-4202-a1b7-cf6a25ea3261/volumes" Feb 28 09:29:10 crc kubenswrapper[4687]: I0228 09:29:10.667498 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8cf1bc0-26d7-4e51-895b-425350692fef" path="/var/lib/kubelet/pods/d8cf1bc0-26d7-4e51-895b-425350692fef/volumes" Feb 28 09:29:10 crc kubenswrapper[4687]: I0228 09:29:10.668091 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0ef2e04-72ff-4461-a018-d126bd85f161" path="/var/lib/kubelet/pods/e0ef2e04-72ff-4461-a018-d126bd85f161/volumes" Feb 28 09:29:11 crc kubenswrapper[4687]: I0228 09:29:11.657137 4687 scope.go:117] "RemoveContainer" containerID="3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" Feb 28 09:29:11 crc kubenswrapper[4687]: E0228 09:29:11.657480 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:29:12 crc kubenswrapper[4687]: I0228 09:29:12.026749 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-knnj5"] Feb 28 09:29:12 crc kubenswrapper[4687]: I0228 09:29:12.033775 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-7b4d-account-create-update-6cbx6"] Feb 28 09:29:12 crc kubenswrapper[4687]: I0228 09:29:12.041144 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-knnj5"] Feb 28 09:29:12 crc kubenswrapper[4687]: I0228 09:29:12.047704 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-7b4d-account-create-update-6cbx6"] Feb 28 09:29:12 crc kubenswrapper[4687]: I0228 09:29:12.667105 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05bb3807-5df1-422e-bc01-f42bec6ed506" path="/var/lib/kubelet/pods/05bb3807-5df1-422e-bc01-f42bec6ed506/volumes" Feb 28 09:29:12 crc kubenswrapper[4687]: I0228 09:29:12.667815 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2212f7e-7ffb-4643-9f92-151ac33b6062" path="/var/lib/kubelet/pods/d2212f7e-7ffb-4643-9f92-151ac33b6062/volumes" Feb 28 09:29:24 crc kubenswrapper[4687]: I0228 09:29:24.656455 4687 scope.go:117] "RemoveContainer" containerID="3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" Feb 28 09:29:24 crc kubenswrapper[4687]: E0228 09:29:24.657045 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:29:28 crc kubenswrapper[4687]: I0228 09:29:28.173147 4687 generic.go:334] "Generic (PLEG): container finished" podID="2162138d-1397-4721-adeb-73e30bf37580" containerID="60a4d48147a7c756c2d1a0ce2ccb082e489228281f33c8d04555f2cf52e9212f" exitCode=0 Feb 28 09:29:28 crc kubenswrapper[4687]: I0228 09:29:28.173243 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt" event={"ID":"2162138d-1397-4721-adeb-73e30bf37580","Type":"ContainerDied","Data":"60a4d48147a7c756c2d1a0ce2ccb082e489228281f33c8d04555f2cf52e9212f"} Feb 28 09:29:29 crc kubenswrapper[4687]: I0228 09:29:29.023649 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-vckkp"] Feb 28 09:29:29 crc kubenswrapper[4687]: I0228 09:29:29.032903 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-vckkp"] Feb 28 09:29:29 crc kubenswrapper[4687]: I0228 09:29:29.494446 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt" Feb 28 09:29:29 crc kubenswrapper[4687]: I0228 09:29:29.611080 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lrcm\" (UniqueName: \"kubernetes.io/projected/2162138d-1397-4721-adeb-73e30bf37580-kube-api-access-2lrcm\") pod \"2162138d-1397-4721-adeb-73e30bf37580\" (UID: \"2162138d-1397-4721-adeb-73e30bf37580\") " Feb 28 09:29:29 crc kubenswrapper[4687]: I0228 09:29:29.611149 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2162138d-1397-4721-adeb-73e30bf37580-ssh-key-openstack-edpm-ipam\") pod \"2162138d-1397-4721-adeb-73e30bf37580\" (UID: \"2162138d-1397-4721-adeb-73e30bf37580\") " Feb 28 09:29:29 crc kubenswrapper[4687]: I0228 09:29:29.611216 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2162138d-1397-4721-adeb-73e30bf37580-inventory\") pod \"2162138d-1397-4721-adeb-73e30bf37580\" (UID: \"2162138d-1397-4721-adeb-73e30bf37580\") " Feb 28 09:29:29 crc kubenswrapper[4687]: I0228 09:29:29.615276 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2162138d-1397-4721-adeb-73e30bf37580-kube-api-access-2lrcm" (OuterVolumeSpecName: "kube-api-access-2lrcm") pod "2162138d-1397-4721-adeb-73e30bf37580" (UID: "2162138d-1397-4721-adeb-73e30bf37580"). InnerVolumeSpecName "kube-api-access-2lrcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:29:29 crc kubenswrapper[4687]: I0228 09:29:29.632642 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2162138d-1397-4721-adeb-73e30bf37580-inventory" (OuterVolumeSpecName: "inventory") pod "2162138d-1397-4721-adeb-73e30bf37580" (UID: "2162138d-1397-4721-adeb-73e30bf37580"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:29:29 crc kubenswrapper[4687]: I0228 09:29:29.634380 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2162138d-1397-4721-adeb-73e30bf37580-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2162138d-1397-4721-adeb-73e30bf37580" (UID: "2162138d-1397-4721-adeb-73e30bf37580"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:29:29 crc kubenswrapper[4687]: I0228 09:29:29.713488 4687 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2162138d-1397-4721-adeb-73e30bf37580-inventory\") on node \"crc\" DevicePath \"\"" Feb 28 09:29:29 crc kubenswrapper[4687]: I0228 09:29:29.713516 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lrcm\" (UniqueName: \"kubernetes.io/projected/2162138d-1397-4721-adeb-73e30bf37580-kube-api-access-2lrcm\") on node \"crc\" DevicePath \"\"" Feb 28 09:29:29 crc kubenswrapper[4687]: I0228 09:29:29.713526 4687 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2162138d-1397-4721-adeb-73e30bf37580-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 28 09:29:30 crc kubenswrapper[4687]: I0228 09:29:30.188727 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt" event={"ID":"2162138d-1397-4721-adeb-73e30bf37580","Type":"ContainerDied","Data":"a75d2897199c85ae54258cf29078dd8dbf669f1a8ed66c357de065744daac2f5"} Feb 28 09:29:30 crc kubenswrapper[4687]: I0228 09:29:30.188768 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a75d2897199c85ae54258cf29078dd8dbf669f1a8ed66c357de065744daac2f5" Feb 28 09:29:30 crc kubenswrapper[4687]: I0228 
09:29:30.188774 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt" Feb 28 09:29:30 crc kubenswrapper[4687]: I0228 09:29:30.259354 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4"] Feb 28 09:29:30 crc kubenswrapper[4687]: E0228 09:29:30.259840 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d741a584-384a-4d5a-bf8e-07e2603f0af0" containerName="oc" Feb 28 09:29:30 crc kubenswrapper[4687]: I0228 09:29:30.259858 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="d741a584-384a-4d5a-bf8e-07e2603f0af0" containerName="oc" Feb 28 09:29:30 crc kubenswrapper[4687]: E0228 09:29:30.259888 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2162138d-1397-4721-adeb-73e30bf37580" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 28 09:29:30 crc kubenswrapper[4687]: I0228 09:29:30.259895 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="2162138d-1397-4721-adeb-73e30bf37580" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 28 09:29:30 crc kubenswrapper[4687]: I0228 09:29:30.260149 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="d741a584-384a-4d5a-bf8e-07e2603f0af0" containerName="oc" Feb 28 09:29:30 crc kubenswrapper[4687]: I0228 09:29:30.260162 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="2162138d-1397-4721-adeb-73e30bf37580" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 28 09:29:30 crc kubenswrapper[4687]: I0228 09:29:30.260893 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4" Feb 28 09:29:30 crc kubenswrapper[4687]: I0228 09:29:30.262349 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ffgb4" Feb 28 09:29:30 crc kubenswrapper[4687]: I0228 09:29:30.262998 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 28 09:29:30 crc kubenswrapper[4687]: I0228 09:29:30.265447 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 28 09:29:30 crc kubenswrapper[4687]: I0228 09:29:30.265825 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 28 09:29:30 crc kubenswrapper[4687]: I0228 09:29:30.265895 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4"] Feb 28 09:29:30 crc kubenswrapper[4687]: I0228 09:29:30.324734 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f3bbc9b7-2863-45fb-a890-fba1253b1f63-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4\" (UID: \"f3bbc9b7-2863-45fb-a890-fba1253b1f63\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4" Feb 28 09:29:30 crc kubenswrapper[4687]: I0228 09:29:30.324812 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3bbc9b7-2863-45fb-a890-fba1253b1f63-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4\" (UID: \"f3bbc9b7-2863-45fb-a890-fba1253b1f63\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4" Feb 28 09:29:30 crc 
kubenswrapper[4687]: I0228 09:29:30.324836 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpbkp\" (UniqueName: \"kubernetes.io/projected/f3bbc9b7-2863-45fb-a890-fba1253b1f63-kube-api-access-dpbkp\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4\" (UID: \"f3bbc9b7-2863-45fb-a890-fba1253b1f63\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4" Feb 28 09:29:30 crc kubenswrapper[4687]: I0228 09:29:30.425841 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f3bbc9b7-2863-45fb-a890-fba1253b1f63-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4\" (UID: \"f3bbc9b7-2863-45fb-a890-fba1253b1f63\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4" Feb 28 09:29:30 crc kubenswrapper[4687]: I0228 09:29:30.425916 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3bbc9b7-2863-45fb-a890-fba1253b1f63-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4\" (UID: \"f3bbc9b7-2863-45fb-a890-fba1253b1f63\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4" Feb 28 09:29:30 crc kubenswrapper[4687]: I0228 09:29:30.425939 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpbkp\" (UniqueName: \"kubernetes.io/projected/f3bbc9b7-2863-45fb-a890-fba1253b1f63-kube-api-access-dpbkp\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4\" (UID: \"f3bbc9b7-2863-45fb-a890-fba1253b1f63\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4" Feb 28 09:29:30 crc kubenswrapper[4687]: I0228 09:29:30.429441 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f3bbc9b7-2863-45fb-a890-fba1253b1f63-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4\" (UID: \"f3bbc9b7-2863-45fb-a890-fba1253b1f63\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4" Feb 28 09:29:30 crc kubenswrapper[4687]: I0228 09:29:30.429752 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3bbc9b7-2863-45fb-a890-fba1253b1f63-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4\" (UID: \"f3bbc9b7-2863-45fb-a890-fba1253b1f63\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4" Feb 28 09:29:30 crc kubenswrapper[4687]: I0228 09:29:30.438976 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpbkp\" (UniqueName: \"kubernetes.io/projected/f3bbc9b7-2863-45fb-a890-fba1253b1f63-kube-api-access-dpbkp\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4\" (UID: \"f3bbc9b7-2863-45fb-a890-fba1253b1f63\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4" Feb 28 09:29:30 crc kubenswrapper[4687]: I0228 09:29:30.572964 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4" Feb 28 09:29:30 crc kubenswrapper[4687]: I0228 09:29:30.664300 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48f20836-ec64-4206-8f2c-4db709f61459" path="/var/lib/kubelet/pods/48f20836-ec64-4206-8f2c-4db709f61459/volumes" Feb 28 09:29:30 crc kubenswrapper[4687]: I0228 09:29:30.995099 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4"] Feb 28 09:29:31 crc kubenswrapper[4687]: I0228 09:29:31.197409 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4" event={"ID":"f3bbc9b7-2863-45fb-a890-fba1253b1f63","Type":"ContainerStarted","Data":"0a279c5404cd131d5ad776cdb2c19b33b4f500ffc3ee67a02c755f9aa40044eb"} Feb 28 09:29:32 crc kubenswrapper[4687]: I0228 09:29:32.221205 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4" event={"ID":"f3bbc9b7-2863-45fb-a890-fba1253b1f63","Type":"ContainerStarted","Data":"b17ceacc83e2bbeb5ef97bcca07317aa1aa4b07b1e533f8eb57847d55aa08181"} Feb 28 09:29:32 crc kubenswrapper[4687]: I0228 09:29:32.246165 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4" podStartSLOduration=1.761131313 podStartE2EDuration="2.246151148s" podCreationTimestamp="2026-02-28 09:29:30 +0000 UTC" firstStartedPulling="2026-02-28 09:29:31.002744981 +0000 UTC m=+1562.693314318" lastFinishedPulling="2026-02-28 09:29:31.487764816 +0000 UTC m=+1563.178334153" observedRunningTime="2026-02-28 09:29:32.239136107 +0000 UTC m=+1563.929705445" watchObservedRunningTime="2026-02-28 09:29:32.246151148 +0000 UTC m=+1563.936720485" Feb 28 09:29:33 crc kubenswrapper[4687]: I0228 09:29:33.038008 4687 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/glance-db-sync-l9np4"] Feb 28 09:29:33 crc kubenswrapper[4687]: I0228 09:29:33.049593 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-l9np4"] Feb 28 09:29:34 crc kubenswrapper[4687]: I0228 09:29:34.665560 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8549972-64f9-4f47-a3db-42053850adb4" path="/var/lib/kubelet/pods/c8549972-64f9-4f47-a3db-42053850adb4/volumes" Feb 28 09:29:36 crc kubenswrapper[4687]: I0228 09:29:36.656487 4687 scope.go:117] "RemoveContainer" containerID="3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" Feb 28 09:29:36 crc kubenswrapper[4687]: E0228 09:29:36.656755 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:29:41 crc kubenswrapper[4687]: I0228 09:29:41.748681 4687 scope.go:117] "RemoveContainer" containerID="5471f01c51c2f9c5c3073b547ee63f530f92e564980b01ee9de3b8792e11deba" Feb 28 09:29:41 crc kubenswrapper[4687]: I0228 09:29:41.780801 4687 scope.go:117] "RemoveContainer" containerID="f97495ca2f3ee2c7437d7157ec557e2540e90cde7b6f582cacb30fcf12613fc4" Feb 28 09:29:41 crc kubenswrapper[4687]: I0228 09:29:41.811972 4687 scope.go:117] "RemoveContainer" containerID="bcf5ef88e87919d4c1b68e62847cbfc6b2632c64d9b9b64f06ca5273977960a1" Feb 28 09:29:41 crc kubenswrapper[4687]: I0228 09:29:41.843080 4687 scope.go:117] "RemoveContainer" containerID="6218b08f1988e15b17ea1d6e05aeacc80e08b573c118e3b0c3f4e4467e0ac852" Feb 28 09:29:41 crc kubenswrapper[4687]: I0228 09:29:41.875760 4687 scope.go:117] "RemoveContainer" 
containerID="0239e9ab22ba287acdc4e244c1de4c8f081fa58787f957df79cefe263a124ad2" Feb 28 09:29:41 crc kubenswrapper[4687]: I0228 09:29:41.911137 4687 scope.go:117] "RemoveContainer" containerID="edbf5171945cc4d9e4218ee79894987ce70f92e1629e0e9f8b5fb8f09e7ad5d0" Feb 28 09:29:41 crc kubenswrapper[4687]: I0228 09:29:41.949950 4687 scope.go:117] "RemoveContainer" containerID="f230721f719bbed5c79d6aba813ad4b0a576e7fa8df89bcf3430a86d08efa913" Feb 28 09:29:41 crc kubenswrapper[4687]: I0228 09:29:41.967642 4687 scope.go:117] "RemoveContainer" containerID="1bd614f3e473614c47200c859df1a3360530237034c4af005f1d4e7a1adf0aca" Feb 28 09:29:48 crc kubenswrapper[4687]: I0228 09:29:48.028380 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-5l5b9"] Feb 28 09:29:48 crc kubenswrapper[4687]: I0228 09:29:48.033565 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-73ed-account-create-update-99ctp"] Feb 28 09:29:48 crc kubenswrapper[4687]: I0228 09:29:48.039485 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-bpfck"] Feb 28 09:29:48 crc kubenswrapper[4687]: I0228 09:29:48.046084 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-7nnfk"] Feb 28 09:29:48 crc kubenswrapper[4687]: I0228 09:29:48.069905 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-9dac-account-create-update-mzccc"] Feb 28 09:29:48 crc kubenswrapper[4687]: I0228 09:29:48.079499 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-73ed-account-create-update-99ctp"] Feb 28 09:29:48 crc kubenswrapper[4687]: I0228 09:29:48.086481 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-4a52-account-create-update-4zmtm"] Feb 28 09:29:48 crc kubenswrapper[4687]: I0228 09:29:48.093847 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-bpfck"] Feb 28 09:29:48 crc kubenswrapper[4687]: 
I0228 09:29:48.101154 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-7nnfk"] Feb 28 09:29:48 crc kubenswrapper[4687]: I0228 09:29:48.108158 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-5l5b9"] Feb 28 09:29:48 crc kubenswrapper[4687]: I0228 09:29:48.114636 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-9dac-account-create-update-mzccc"] Feb 28 09:29:48 crc kubenswrapper[4687]: I0228 09:29:48.120094 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-4a52-account-create-update-4zmtm"] Feb 28 09:29:48 crc kubenswrapper[4687]: I0228 09:29:48.665537 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1214eb91-e4cb-4337-ab5c-e27c0dd55151" path="/var/lib/kubelet/pods/1214eb91-e4cb-4337-ab5c-e27c0dd55151/volumes" Feb 28 09:29:48 crc kubenswrapper[4687]: I0228 09:29:48.666376 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14080cd3-b175-4324-aacc-c3c47ead6896" path="/var/lib/kubelet/pods/14080cd3-b175-4324-aacc-c3c47ead6896/volumes" Feb 28 09:29:48 crc kubenswrapper[4687]: I0228 09:29:48.666899 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15ea0f78-cfa4-4a12-8e4e-92bc30488ad1" path="/var/lib/kubelet/pods/15ea0f78-cfa4-4a12-8e4e-92bc30488ad1/volumes" Feb 28 09:29:48 crc kubenswrapper[4687]: I0228 09:29:48.667448 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47d4394e-e0a9-4ea7-b670-fd088aa62341" path="/var/lib/kubelet/pods/47d4394e-e0a9-4ea7-b670-fd088aa62341/volumes" Feb 28 09:29:48 crc kubenswrapper[4687]: I0228 09:29:48.668453 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae33b9ae-c76a-41e3-9497-f6cbe4f4b740" path="/var/lib/kubelet/pods/ae33b9ae-c76a-41e3-9497-f6cbe4f4b740/volumes" Feb 28 09:29:48 crc kubenswrapper[4687]: I0228 09:29:48.668979 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="c99cb41b-642b-4dab-bd03-a8f61456a0c5" path="/var/lib/kubelet/pods/c99cb41b-642b-4dab-bd03-a8f61456a0c5/volumes" Feb 28 09:29:49 crc kubenswrapper[4687]: I0228 09:29:49.657724 4687 scope.go:117] "RemoveContainer" containerID="3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" Feb 28 09:29:49 crc kubenswrapper[4687]: E0228 09:29:49.658003 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:29:52 crc kubenswrapper[4687]: I0228 09:29:52.027182 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-8jhf4"] Feb 28 09:29:52 crc kubenswrapper[4687]: I0228 09:29:52.034370 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-8jhf4"] Feb 28 09:29:52 crc kubenswrapper[4687]: I0228 09:29:52.665751 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4856ec29-c1c6-4c66-b64d-0daf938e4104" path="/var/lib/kubelet/pods/4856ec29-c1c6-4c66-b64d-0daf938e4104/volumes" Feb 28 09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.135873 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537850-sgh2x"] Feb 28 09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.137485 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537850-sgh2x" Feb 28 09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.139049 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.139752 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.141290 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.141846 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537850-pghm8"] Feb 28 09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.143050 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537850-pghm8" Feb 28 09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.144222 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 28 09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.144624 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 28 09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.147157 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537850-sgh2x"] Feb 28 09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.152218 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537850-pghm8"] Feb 28 09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.223297 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/34dc8949-e15f-4b19-a04c-7b0997af7ed9-config-volume\") pod \"collect-profiles-29537850-pghm8\" (UID: \"34dc8949-e15f-4b19-a04c-7b0997af7ed9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537850-pghm8" Feb 28 09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.223636 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq76c\" (UniqueName: \"kubernetes.io/projected/34dc8949-e15f-4b19-a04c-7b0997af7ed9-kube-api-access-pq76c\") pod \"collect-profiles-29537850-pghm8\" (UID: \"34dc8949-e15f-4b19-a04c-7b0997af7ed9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537850-pghm8" Feb 28 09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.223767 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m48wz\" (UniqueName: \"kubernetes.io/projected/e9d983fd-590f-4ca9-8de5-361bc4f3a6f2-kube-api-access-m48wz\") pod \"auto-csr-approver-29537850-sgh2x\" (UID: \"e9d983fd-590f-4ca9-8de5-361bc4f3a6f2\") " pod="openshift-infra/auto-csr-approver-29537850-sgh2x" Feb 28 09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.224297 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34dc8949-e15f-4b19-a04c-7b0997af7ed9-secret-volume\") pod \"collect-profiles-29537850-pghm8\" (UID: \"34dc8949-e15f-4b19-a04c-7b0997af7ed9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537850-pghm8" Feb 28 09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.327000 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34dc8949-e15f-4b19-a04c-7b0997af7ed9-secret-volume\") pod \"collect-profiles-29537850-pghm8\" (UID: \"34dc8949-e15f-4b19-a04c-7b0997af7ed9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537850-pghm8" Feb 28 
09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.327210 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34dc8949-e15f-4b19-a04c-7b0997af7ed9-config-volume\") pod \"collect-profiles-29537850-pghm8\" (UID: \"34dc8949-e15f-4b19-a04c-7b0997af7ed9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537850-pghm8" Feb 28 09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.327297 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq76c\" (UniqueName: \"kubernetes.io/projected/34dc8949-e15f-4b19-a04c-7b0997af7ed9-kube-api-access-pq76c\") pod \"collect-profiles-29537850-pghm8\" (UID: \"34dc8949-e15f-4b19-a04c-7b0997af7ed9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537850-pghm8" Feb 28 09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.327420 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m48wz\" (UniqueName: \"kubernetes.io/projected/e9d983fd-590f-4ca9-8de5-361bc4f3a6f2-kube-api-access-m48wz\") pod \"auto-csr-approver-29537850-sgh2x\" (UID: \"e9d983fd-590f-4ca9-8de5-361bc4f3a6f2\") " pod="openshift-infra/auto-csr-approver-29537850-sgh2x" Feb 28 09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.328012 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34dc8949-e15f-4b19-a04c-7b0997af7ed9-config-volume\") pod \"collect-profiles-29537850-pghm8\" (UID: \"34dc8949-e15f-4b19-a04c-7b0997af7ed9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537850-pghm8" Feb 28 09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.333975 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34dc8949-e15f-4b19-a04c-7b0997af7ed9-secret-volume\") pod \"collect-profiles-29537850-pghm8\" (UID: 
\"34dc8949-e15f-4b19-a04c-7b0997af7ed9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537850-pghm8" Feb 28 09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.342822 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m48wz\" (UniqueName: \"kubernetes.io/projected/e9d983fd-590f-4ca9-8de5-361bc4f3a6f2-kube-api-access-m48wz\") pod \"auto-csr-approver-29537850-sgh2x\" (UID: \"e9d983fd-590f-4ca9-8de5-361bc4f3a6f2\") " pod="openshift-infra/auto-csr-approver-29537850-sgh2x" Feb 28 09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.343248 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq76c\" (UniqueName: \"kubernetes.io/projected/34dc8949-e15f-4b19-a04c-7b0997af7ed9-kube-api-access-pq76c\") pod \"collect-profiles-29537850-pghm8\" (UID: \"34dc8949-e15f-4b19-a04c-7b0997af7ed9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537850-pghm8" Feb 28 09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.462317 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537850-sgh2x" Feb 28 09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.472007 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537850-pghm8" Feb 28 09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.863838 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537850-pghm8"] Feb 28 09:30:00 crc kubenswrapper[4687]: I0228 09:30:00.925979 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537850-sgh2x"] Feb 28 09:30:01 crc kubenswrapper[4687]: I0228 09:30:01.458157 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537850-sgh2x" event={"ID":"e9d983fd-590f-4ca9-8de5-361bc4f3a6f2","Type":"ContainerStarted","Data":"fa33015901d3f937db285181ae25964740f02b398f67c1fd8650c0eb1bb83559"} Feb 28 09:30:01 crc kubenswrapper[4687]: I0228 09:30:01.460605 4687 generic.go:334] "Generic (PLEG): container finished" podID="34dc8949-e15f-4b19-a04c-7b0997af7ed9" containerID="7e7546254280b8ddcbe15002f529b333f4cb050eca69e34a31bd740db24b8ff0" exitCode=0 Feb 28 09:30:01 crc kubenswrapper[4687]: I0228 09:30:01.460652 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537850-pghm8" event={"ID":"34dc8949-e15f-4b19-a04c-7b0997af7ed9","Type":"ContainerDied","Data":"7e7546254280b8ddcbe15002f529b333f4cb050eca69e34a31bd740db24b8ff0"} Feb 28 09:30:01 crc kubenswrapper[4687]: I0228 09:30:01.460683 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537850-pghm8" event={"ID":"34dc8949-e15f-4b19-a04c-7b0997af7ed9","Type":"ContainerStarted","Data":"8299f97ee17c5feec4788c7fd38af1af94375623dfab96ceb0c9df38818c6ed1"} Feb 28 09:30:02 crc kubenswrapper[4687]: I0228 09:30:02.472759 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537850-sgh2x" 
event={"ID":"e9d983fd-590f-4ca9-8de5-361bc4f3a6f2","Type":"ContainerStarted","Data":"4198b35ab426e3f30912756a63f22176d5af8a8651a16dc4c67516d7e15a6674"} Feb 28 09:30:02 crc kubenswrapper[4687]: I0228 09:30:02.489479 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29537850-sgh2x" podStartSLOduration=1.297369634 podStartE2EDuration="2.489460803s" podCreationTimestamp="2026-02-28 09:30:00 +0000 UTC" firstStartedPulling="2026-02-28 09:30:00.927596143 +0000 UTC m=+1592.618165471" lastFinishedPulling="2026-02-28 09:30:02.119687304 +0000 UTC m=+1593.810256640" observedRunningTime="2026-02-28 09:30:02.484965291 +0000 UTC m=+1594.175534627" watchObservedRunningTime="2026-02-28 09:30:02.489460803 +0000 UTC m=+1594.180030140" Feb 28 09:30:02 crc kubenswrapper[4687]: I0228 09:30:02.658243 4687 scope.go:117] "RemoveContainer" containerID="3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" Feb 28 09:30:02 crc kubenswrapper[4687]: E0228 09:30:02.658699 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:30:02 crc kubenswrapper[4687]: I0228 09:30:02.746429 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537850-pghm8" Feb 28 09:30:02 crc kubenswrapper[4687]: I0228 09:30:02.775054 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34dc8949-e15f-4b19-a04c-7b0997af7ed9-secret-volume\") pod \"34dc8949-e15f-4b19-a04c-7b0997af7ed9\" (UID: \"34dc8949-e15f-4b19-a04c-7b0997af7ed9\") " Feb 28 09:30:02 crc kubenswrapper[4687]: I0228 09:30:02.775209 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq76c\" (UniqueName: \"kubernetes.io/projected/34dc8949-e15f-4b19-a04c-7b0997af7ed9-kube-api-access-pq76c\") pod \"34dc8949-e15f-4b19-a04c-7b0997af7ed9\" (UID: \"34dc8949-e15f-4b19-a04c-7b0997af7ed9\") " Feb 28 09:30:02 crc kubenswrapper[4687]: I0228 09:30:02.775344 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34dc8949-e15f-4b19-a04c-7b0997af7ed9-config-volume\") pod \"34dc8949-e15f-4b19-a04c-7b0997af7ed9\" (UID: \"34dc8949-e15f-4b19-a04c-7b0997af7ed9\") " Feb 28 09:30:02 crc kubenswrapper[4687]: I0228 09:30:02.775829 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34dc8949-e15f-4b19-a04c-7b0997af7ed9-config-volume" (OuterVolumeSpecName: "config-volume") pod "34dc8949-e15f-4b19-a04c-7b0997af7ed9" (UID: "34dc8949-e15f-4b19-a04c-7b0997af7ed9"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:30:02 crc kubenswrapper[4687]: I0228 09:30:02.776458 4687 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34dc8949-e15f-4b19-a04c-7b0997af7ed9-config-volume\") on node \"crc\" DevicePath \"\"" Feb 28 09:30:02 crc kubenswrapper[4687]: I0228 09:30:02.780923 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34dc8949-e15f-4b19-a04c-7b0997af7ed9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "34dc8949-e15f-4b19-a04c-7b0997af7ed9" (UID: "34dc8949-e15f-4b19-a04c-7b0997af7ed9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:30:02 crc kubenswrapper[4687]: I0228 09:30:02.780947 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34dc8949-e15f-4b19-a04c-7b0997af7ed9-kube-api-access-pq76c" (OuterVolumeSpecName: "kube-api-access-pq76c") pod "34dc8949-e15f-4b19-a04c-7b0997af7ed9" (UID: "34dc8949-e15f-4b19-a04c-7b0997af7ed9"). InnerVolumeSpecName "kube-api-access-pq76c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:30:02 crc kubenswrapper[4687]: I0228 09:30:02.879524 4687 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34dc8949-e15f-4b19-a04c-7b0997af7ed9-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 28 09:30:02 crc kubenswrapper[4687]: I0228 09:30:02.879618 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq76c\" (UniqueName: \"kubernetes.io/projected/34dc8949-e15f-4b19-a04c-7b0997af7ed9-kube-api-access-pq76c\") on node \"crc\" DevicePath \"\"" Feb 28 09:30:03 crc kubenswrapper[4687]: I0228 09:30:03.482812 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537850-pghm8" event={"ID":"34dc8949-e15f-4b19-a04c-7b0997af7ed9","Type":"ContainerDied","Data":"8299f97ee17c5feec4788c7fd38af1af94375623dfab96ceb0c9df38818c6ed1"} Feb 28 09:30:03 crc kubenswrapper[4687]: I0228 09:30:03.483276 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8299f97ee17c5feec4788c7fd38af1af94375623dfab96ceb0c9df38818c6ed1" Feb 28 09:30:03 crc kubenswrapper[4687]: I0228 09:30:03.482855 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537850-pghm8" Feb 28 09:30:03 crc kubenswrapper[4687]: I0228 09:30:03.486657 4687 generic.go:334] "Generic (PLEG): container finished" podID="e9d983fd-590f-4ca9-8de5-361bc4f3a6f2" containerID="4198b35ab426e3f30912756a63f22176d5af8a8651a16dc4c67516d7e15a6674" exitCode=0 Feb 28 09:30:03 crc kubenswrapper[4687]: I0228 09:30:03.486711 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537850-sgh2x" event={"ID":"e9d983fd-590f-4ca9-8de5-361bc4f3a6f2","Type":"ContainerDied","Data":"4198b35ab426e3f30912756a63f22176d5af8a8651a16dc4c67516d7e15a6674"} Feb 28 09:30:04 crc kubenswrapper[4687]: I0228 09:30:04.750299 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537850-sgh2x" Feb 28 09:30:04 crc kubenswrapper[4687]: I0228 09:30:04.821164 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m48wz\" (UniqueName: \"kubernetes.io/projected/e9d983fd-590f-4ca9-8de5-361bc4f3a6f2-kube-api-access-m48wz\") pod \"e9d983fd-590f-4ca9-8de5-361bc4f3a6f2\" (UID: \"e9d983fd-590f-4ca9-8de5-361bc4f3a6f2\") " Feb 28 09:30:04 crc kubenswrapper[4687]: I0228 09:30:04.830150 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9d983fd-590f-4ca9-8de5-361bc4f3a6f2-kube-api-access-m48wz" (OuterVolumeSpecName: "kube-api-access-m48wz") pod "e9d983fd-590f-4ca9-8de5-361bc4f3a6f2" (UID: "e9d983fd-590f-4ca9-8de5-361bc4f3a6f2"). InnerVolumeSpecName "kube-api-access-m48wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:30:04 crc kubenswrapper[4687]: I0228 09:30:04.923874 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m48wz\" (UniqueName: \"kubernetes.io/projected/e9d983fd-590f-4ca9-8de5-361bc4f3a6f2-kube-api-access-m48wz\") on node \"crc\" DevicePath \"\"" Feb 28 09:30:05 crc kubenswrapper[4687]: I0228 09:30:05.505936 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537850-sgh2x" event={"ID":"e9d983fd-590f-4ca9-8de5-361bc4f3a6f2","Type":"ContainerDied","Data":"fa33015901d3f937db285181ae25964740f02b398f67c1fd8650c0eb1bb83559"} Feb 28 09:30:05 crc kubenswrapper[4687]: I0228 09:30:05.506005 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa33015901d3f937db285181ae25964740f02b398f67c1fd8650c0eb1bb83559" Feb 28 09:30:05 crc kubenswrapper[4687]: I0228 09:30:05.506078 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537850-sgh2x" Feb 28 09:30:05 crc kubenswrapper[4687]: I0228 09:30:05.552619 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537844-nhrcz"] Feb 28 09:30:05 crc kubenswrapper[4687]: I0228 09:30:05.560069 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537844-nhrcz"] Feb 28 09:30:06 crc kubenswrapper[4687]: I0228 09:30:06.667623 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab0c3dcc-aa8f-41f1-8014-05bf76455d2a" path="/var/lib/kubelet/pods/ab0c3dcc-aa8f-41f1-8014-05bf76455d2a/volumes" Feb 28 09:30:09 crc kubenswrapper[4687]: I0228 09:30:09.031319 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-br7kf"] Feb 28 09:30:09 crc kubenswrapper[4687]: I0228 09:30:09.034702 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-br7kf"] Feb 28 
09:30:10 crc kubenswrapper[4687]: I0228 09:30:10.668766 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="268be2d7-dd2e-42f0-b112-230de1abb1d4" path="/var/lib/kubelet/pods/268be2d7-dd2e-42f0-b112-230de1abb1d4/volumes" Feb 28 09:30:16 crc kubenswrapper[4687]: I0228 09:30:16.656567 4687 scope.go:117] "RemoveContainer" containerID="3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" Feb 28 09:30:16 crc kubenswrapper[4687]: E0228 09:30:16.657244 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:30:25 crc kubenswrapper[4687]: I0228 09:30:25.657766 4687 generic.go:334] "Generic (PLEG): container finished" podID="f3bbc9b7-2863-45fb-a890-fba1253b1f63" containerID="b17ceacc83e2bbeb5ef97bcca07317aa1aa4b07b1e533f8eb57847d55aa08181" exitCode=0 Feb 28 09:30:25 crc kubenswrapper[4687]: I0228 09:30:25.657860 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4" event={"ID":"f3bbc9b7-2863-45fb-a890-fba1253b1f63","Type":"ContainerDied","Data":"b17ceacc83e2bbeb5ef97bcca07317aa1aa4b07b1e533f8eb57847d55aa08181"} Feb 28 09:30:26 crc kubenswrapper[4687]: I0228 09:30:26.970203 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.015067 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpbkp\" (UniqueName: \"kubernetes.io/projected/f3bbc9b7-2863-45fb-a890-fba1253b1f63-kube-api-access-dpbkp\") pod \"f3bbc9b7-2863-45fb-a890-fba1253b1f63\" (UID: \"f3bbc9b7-2863-45fb-a890-fba1253b1f63\") " Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.015124 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3bbc9b7-2863-45fb-a890-fba1253b1f63-inventory\") pod \"f3bbc9b7-2863-45fb-a890-fba1253b1f63\" (UID: \"f3bbc9b7-2863-45fb-a890-fba1253b1f63\") " Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.015194 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f3bbc9b7-2863-45fb-a890-fba1253b1f63-ssh-key-openstack-edpm-ipam\") pod \"f3bbc9b7-2863-45fb-a890-fba1253b1f63\" (UID: \"f3bbc9b7-2863-45fb-a890-fba1253b1f63\") " Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.019450 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3bbc9b7-2863-45fb-a890-fba1253b1f63-kube-api-access-dpbkp" (OuterVolumeSpecName: "kube-api-access-dpbkp") pod "f3bbc9b7-2863-45fb-a890-fba1253b1f63" (UID: "f3bbc9b7-2863-45fb-a890-fba1253b1f63"). InnerVolumeSpecName "kube-api-access-dpbkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.035696 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3bbc9b7-2863-45fb-a890-fba1253b1f63-inventory" (OuterVolumeSpecName: "inventory") pod "f3bbc9b7-2863-45fb-a890-fba1253b1f63" (UID: "f3bbc9b7-2863-45fb-a890-fba1253b1f63"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.036434 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3bbc9b7-2863-45fb-a890-fba1253b1f63-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f3bbc9b7-2863-45fb-a890-fba1253b1f63" (UID: "f3bbc9b7-2863-45fb-a890-fba1253b1f63"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.117298 4687 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f3bbc9b7-2863-45fb-a890-fba1253b1f63-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.117328 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpbkp\" (UniqueName: \"kubernetes.io/projected/f3bbc9b7-2863-45fb-a890-fba1253b1f63-kube-api-access-dpbkp\") on node \"crc\" DevicePath \"\"" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.117338 4687 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3bbc9b7-2863-45fb-a890-fba1253b1f63-inventory\") on node \"crc\" DevicePath \"\"" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.656404 4687 scope.go:117] "RemoveContainer" containerID="3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" Feb 28 09:30:27 crc kubenswrapper[4687]: E0228 09:30:27.656896 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.678208 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4" event={"ID":"f3bbc9b7-2863-45fb-a890-fba1253b1f63","Type":"ContainerDied","Data":"0a279c5404cd131d5ad776cdb2c19b33b4f500ffc3ee67a02c755f9aa40044eb"} Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.678239 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a279c5404cd131d5ad776cdb2c19b33b4f500ffc3ee67a02c755f9aa40044eb" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.678264 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.744121 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zgthc"] Feb 28 09:30:27 crc kubenswrapper[4687]: E0228 09:30:27.744545 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3bbc9b7-2863-45fb-a890-fba1253b1f63" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.744567 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3bbc9b7-2863-45fb-a890-fba1253b1f63" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 28 09:30:27 crc kubenswrapper[4687]: E0228 09:30:27.744590 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34dc8949-e15f-4b19-a04c-7b0997af7ed9" containerName="collect-profiles" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.744597 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="34dc8949-e15f-4b19-a04c-7b0997af7ed9" containerName="collect-profiles" Feb 28 09:30:27 crc kubenswrapper[4687]: 
E0228 09:30:27.744614 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9d983fd-590f-4ca9-8de5-361bc4f3a6f2" containerName="oc" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.744620 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9d983fd-590f-4ca9-8de5-361bc4f3a6f2" containerName="oc" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.744820 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="34dc8949-e15f-4b19-a04c-7b0997af7ed9" containerName="collect-profiles" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.744855 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9d983fd-590f-4ca9-8de5-361bc4f3a6f2" containerName="oc" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.744869 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3bbc9b7-2863-45fb-a890-fba1253b1f63" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.745521 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zgthc" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.747526 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.747677 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ffgb4" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.748148 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.750996 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.752462 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zgthc"] Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.827706 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b83907ec-ac55-4f72-9265-e919fa57514a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zgthc\" (UID: \"b83907ec-ac55-4f72-9265-e919fa57514a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zgthc" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.827893 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b83907ec-ac55-4f72-9265-e919fa57514a-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zgthc\" (UID: \"b83907ec-ac55-4f72-9265-e919fa57514a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zgthc" Feb 28 09:30:27 crc kubenswrapper[4687]: 
I0228 09:30:27.828208 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmrxw\" (UniqueName: \"kubernetes.io/projected/b83907ec-ac55-4f72-9265-e919fa57514a-kube-api-access-qmrxw\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zgthc\" (UID: \"b83907ec-ac55-4f72-9265-e919fa57514a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zgthc" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.929854 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b83907ec-ac55-4f72-9265-e919fa57514a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zgthc\" (UID: \"b83907ec-ac55-4f72-9265-e919fa57514a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zgthc" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.929931 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b83907ec-ac55-4f72-9265-e919fa57514a-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zgthc\" (UID: \"b83907ec-ac55-4f72-9265-e919fa57514a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zgthc" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.929996 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmrxw\" (UniqueName: \"kubernetes.io/projected/b83907ec-ac55-4f72-9265-e919fa57514a-kube-api-access-qmrxw\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zgthc\" (UID: \"b83907ec-ac55-4f72-9265-e919fa57514a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zgthc" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.933648 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/b83907ec-ac55-4f72-9265-e919fa57514a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zgthc\" (UID: \"b83907ec-ac55-4f72-9265-e919fa57514a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zgthc" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.933876 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b83907ec-ac55-4f72-9265-e919fa57514a-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zgthc\" (UID: \"b83907ec-ac55-4f72-9265-e919fa57514a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zgthc" Feb 28 09:30:27 crc kubenswrapper[4687]: I0228 09:30:27.943247 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmrxw\" (UniqueName: \"kubernetes.io/projected/b83907ec-ac55-4f72-9265-e919fa57514a-kube-api-access-qmrxw\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zgthc\" (UID: \"b83907ec-ac55-4f72-9265-e919fa57514a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zgthc" Feb 28 09:30:28 crc kubenswrapper[4687]: I0228 09:30:28.030470 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-mcfl6"] Feb 28 09:30:28 crc kubenswrapper[4687]: I0228 09:30:28.037532 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-v6s24"] Feb 28 09:30:28 crc kubenswrapper[4687]: I0228 09:30:28.045122 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-v6s24"] Feb 28 09:30:28 crc kubenswrapper[4687]: I0228 09:30:28.050668 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-mcfl6"] Feb 28 09:30:28 crc kubenswrapper[4687]: I0228 09:30:28.070710 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zgthc" Feb 28 09:30:28 crc kubenswrapper[4687]: I0228 09:30:28.492096 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zgthc"] Feb 28 09:30:28 crc kubenswrapper[4687]: I0228 09:30:28.664385 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c8490bf-32fb-4d04-974d-b2ca311f4b55" path="/var/lib/kubelet/pods/2c8490bf-32fb-4d04-974d-b2ca311f4b55/volumes" Feb 28 09:30:28 crc kubenswrapper[4687]: I0228 09:30:28.665127 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef1fa0a3-ab49-4807-a503-3a51a2b70e26" path="/var/lib/kubelet/pods/ef1fa0a3-ab49-4807-a503-3a51a2b70e26/volumes" Feb 28 09:30:28 crc kubenswrapper[4687]: I0228 09:30:28.692835 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zgthc" event={"ID":"b83907ec-ac55-4f72-9265-e919fa57514a","Type":"ContainerStarted","Data":"ef79d48923523898d9cdd41f55b4b9608474ebae8ddd63528fec640296b103c4"} Feb 28 09:30:28 crc kubenswrapper[4687]: I0228 09:30:28.991339 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 28 09:30:29 crc kubenswrapper[4687]: I0228 09:30:29.701409 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zgthc" event={"ID":"b83907ec-ac55-4f72-9265-e919fa57514a","Type":"ContainerStarted","Data":"2fc46d16cea8de8bec472b11e4286862a7697fd51e2392b31953deba803b0b48"} Feb 28 09:30:29 crc kubenswrapper[4687]: I0228 09:30:29.718669 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zgthc" podStartSLOduration=2.228393896 podStartE2EDuration="2.718653704s" podCreationTimestamp="2026-02-28 09:30:27 +0000 UTC" 
firstStartedPulling="2026-02-28 09:30:28.498961425 +0000 UTC m=+1620.189530763" lastFinishedPulling="2026-02-28 09:30:28.989221234 +0000 UTC m=+1620.679790571" observedRunningTime="2026-02-28 09:30:29.716651158 +0000 UTC m=+1621.407220495" watchObservedRunningTime="2026-02-28 09:30:29.718653704 +0000 UTC m=+1621.409223041" Feb 28 09:30:32 crc kubenswrapper[4687]: I0228 09:30:32.723427 4687 generic.go:334] "Generic (PLEG): container finished" podID="b83907ec-ac55-4f72-9265-e919fa57514a" containerID="2fc46d16cea8de8bec472b11e4286862a7697fd51e2392b31953deba803b0b48" exitCode=0 Feb 28 09:30:32 crc kubenswrapper[4687]: I0228 09:30:32.723506 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zgthc" event={"ID":"b83907ec-ac55-4f72-9265-e919fa57514a","Type":"ContainerDied","Data":"2fc46d16cea8de8bec472b11e4286862a7697fd51e2392b31953deba803b0b48"} Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.035216 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-mvkm8"] Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.042282 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-mvkm8"] Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.043470 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zgthc" Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.151168 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmrxw\" (UniqueName: \"kubernetes.io/projected/b83907ec-ac55-4f72-9265-e919fa57514a-kube-api-access-qmrxw\") pod \"b83907ec-ac55-4f72-9265-e919fa57514a\" (UID: \"b83907ec-ac55-4f72-9265-e919fa57514a\") " Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.151240 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b83907ec-ac55-4f72-9265-e919fa57514a-ssh-key-openstack-edpm-ipam\") pod \"b83907ec-ac55-4f72-9265-e919fa57514a\" (UID: \"b83907ec-ac55-4f72-9265-e919fa57514a\") " Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.152189 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b83907ec-ac55-4f72-9265-e919fa57514a-inventory\") pod \"b83907ec-ac55-4f72-9265-e919fa57514a\" (UID: \"b83907ec-ac55-4f72-9265-e919fa57514a\") " Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.158177 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b83907ec-ac55-4f72-9265-e919fa57514a-kube-api-access-qmrxw" (OuterVolumeSpecName: "kube-api-access-qmrxw") pod "b83907ec-ac55-4f72-9265-e919fa57514a" (UID: "b83907ec-ac55-4f72-9265-e919fa57514a"). InnerVolumeSpecName "kube-api-access-qmrxw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.176650 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b83907ec-ac55-4f72-9265-e919fa57514a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b83907ec-ac55-4f72-9265-e919fa57514a" (UID: "b83907ec-ac55-4f72-9265-e919fa57514a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.177899 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b83907ec-ac55-4f72-9265-e919fa57514a-inventory" (OuterVolumeSpecName: "inventory") pod "b83907ec-ac55-4f72-9265-e919fa57514a" (UID: "b83907ec-ac55-4f72-9265-e919fa57514a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.253763 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmrxw\" (UniqueName: \"kubernetes.io/projected/b83907ec-ac55-4f72-9265-e919fa57514a-kube-api-access-qmrxw\") on node \"crc\" DevicePath \"\"" Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.253795 4687 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b83907ec-ac55-4f72-9265-e919fa57514a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.253831 4687 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b83907ec-ac55-4f72-9265-e919fa57514a-inventory\") on node \"crc\" DevicePath \"\"" Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.664879 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21a39679-80b0-4a80-ad64-fe3707c2a9f0" 
path="/var/lib/kubelet/pods/21a39679-80b0-4a80-ad64-fe3707c2a9f0/volumes" Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.738382 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zgthc" event={"ID":"b83907ec-ac55-4f72-9265-e919fa57514a","Type":"ContainerDied","Data":"ef79d48923523898d9cdd41f55b4b9608474ebae8ddd63528fec640296b103c4"} Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.738406 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zgthc" Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.738421 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef79d48923523898d9cdd41f55b4b9608474ebae8ddd63528fec640296b103c4" Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.800155 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gdmsq"] Feb 28 09:30:34 crc kubenswrapper[4687]: E0228 09:30:34.800568 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b83907ec-ac55-4f72-9265-e919fa57514a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.800590 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="b83907ec-ac55-4f72-9265-e919fa57514a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.800774 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="b83907ec-ac55-4f72-9265-e919fa57514a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.801378 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gdmsq" Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.804338 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.804807 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ffgb4" Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.805911 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.805995 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.806773 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gdmsq"] Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.862674 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cdh4\" (UniqueName: \"kubernetes.io/projected/380b1201-b6ba-48e4-b282-fad4f9b945d7-kube-api-access-4cdh4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gdmsq\" (UID: \"380b1201-b6ba-48e4-b282-fad4f9b945d7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gdmsq" Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.862742 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/380b1201-b6ba-48e4-b282-fad4f9b945d7-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gdmsq\" (UID: \"380b1201-b6ba-48e4-b282-fad4f9b945d7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gdmsq" Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 
09:30:34.862842 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/380b1201-b6ba-48e4-b282-fad4f9b945d7-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gdmsq\" (UID: \"380b1201-b6ba-48e4-b282-fad4f9b945d7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gdmsq" Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.964336 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cdh4\" (UniqueName: \"kubernetes.io/projected/380b1201-b6ba-48e4-b282-fad4f9b945d7-kube-api-access-4cdh4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gdmsq\" (UID: \"380b1201-b6ba-48e4-b282-fad4f9b945d7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gdmsq" Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.964402 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/380b1201-b6ba-48e4-b282-fad4f9b945d7-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gdmsq\" (UID: \"380b1201-b6ba-48e4-b282-fad4f9b945d7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gdmsq" Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.964445 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/380b1201-b6ba-48e4-b282-fad4f9b945d7-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gdmsq\" (UID: \"380b1201-b6ba-48e4-b282-fad4f9b945d7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gdmsq" Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.969395 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/380b1201-b6ba-48e4-b282-fad4f9b945d7-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gdmsq\" (UID: \"380b1201-b6ba-48e4-b282-fad4f9b945d7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gdmsq" Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.970037 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/380b1201-b6ba-48e4-b282-fad4f9b945d7-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gdmsq\" (UID: \"380b1201-b6ba-48e4-b282-fad4f9b945d7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gdmsq" Feb 28 09:30:34 crc kubenswrapper[4687]: I0228 09:30:34.979558 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cdh4\" (UniqueName: \"kubernetes.io/projected/380b1201-b6ba-48e4-b282-fad4f9b945d7-kube-api-access-4cdh4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gdmsq\" (UID: \"380b1201-b6ba-48e4-b282-fad4f9b945d7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gdmsq" Feb 28 09:30:35 crc kubenswrapper[4687]: I0228 09:30:35.113555 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gdmsq" Feb 28 09:30:35 crc kubenswrapper[4687]: I0228 09:30:35.551914 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gdmsq"] Feb 28 09:30:35 crc kubenswrapper[4687]: I0228 09:30:35.745241 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gdmsq" event={"ID":"380b1201-b6ba-48e4-b282-fad4f9b945d7","Type":"ContainerStarted","Data":"1c8a65d8180b1886f67a3d05c0251cc33388130577c23356c2c4a74b4e3cacd5"} Feb 28 09:30:36 crc kubenswrapper[4687]: I0228 09:30:36.754069 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gdmsq" event={"ID":"380b1201-b6ba-48e4-b282-fad4f9b945d7","Type":"ContainerStarted","Data":"710deaf1dd2361f05b301ce20edbb32a034243ef82ea56111bcc74c44e33c0dd"} Feb 28 09:30:36 crc kubenswrapper[4687]: I0228 09:30:36.794749 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gdmsq" podStartSLOduration=2.347693693 podStartE2EDuration="2.794723206s" podCreationTimestamp="2026-02-28 09:30:34 +0000 UTC" firstStartedPulling="2026-02-28 09:30:35.562412672 +0000 UTC m=+1627.252982009" lastFinishedPulling="2026-02-28 09:30:36.009442195 +0000 UTC m=+1627.700011522" observedRunningTime="2026-02-28 09:30:36.783266313 +0000 UTC m=+1628.473835650" watchObservedRunningTime="2026-02-28 09:30:36.794723206 +0000 UTC m=+1628.485292543" Feb 28 09:30:39 crc kubenswrapper[4687]: I0228 09:30:39.030323 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-c9j72"] Feb 28 09:30:39 crc kubenswrapper[4687]: I0228 09:30:39.037402 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-c9j72"] Feb 28 09:30:39 crc kubenswrapper[4687]: I0228 09:30:39.656724 4687 
scope.go:117] "RemoveContainer" containerID="3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" Feb 28 09:30:39 crc kubenswrapper[4687]: E0228 09:30:39.657005 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:30:40 crc kubenswrapper[4687]: I0228 09:30:40.666340 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e5e221e-73c7-44a2-9af9-0feb60b412e0" path="/var/lib/kubelet/pods/3e5e221e-73c7-44a2-9af9-0feb60b412e0/volumes" Feb 28 09:30:42 crc kubenswrapper[4687]: I0228 09:30:42.085512 4687 scope.go:117] "RemoveContainer" containerID="36d8793b5506960f0edd95fae453cc7431c4d82d7aee4458db381af12f245d6b" Feb 28 09:30:42 crc kubenswrapper[4687]: I0228 09:30:42.117689 4687 scope.go:117] "RemoveContainer" containerID="4256a783c481eb3f81ab67e9750afeddc57187e7417b95fb7a456df1df32422b" Feb 28 09:30:42 crc kubenswrapper[4687]: I0228 09:30:42.138972 4687 scope.go:117] "RemoveContainer" containerID="6797815408b770565467666b143e0cd011b644fdb43e312af592f4cf558f9d0f" Feb 28 09:30:42 crc kubenswrapper[4687]: I0228 09:30:42.171182 4687 scope.go:117] "RemoveContainer" containerID="009568f75339d1a8c3c9123c3946d8b90dfafcad46f65b2258c839ba6da203dd" Feb 28 09:30:42 crc kubenswrapper[4687]: I0228 09:30:42.217630 4687 scope.go:117] "RemoveContainer" containerID="ffaa6edabe26e3187e8637f65143ca2be45488d26ff58086f7d34291009d8496" Feb 28 09:30:42 crc kubenswrapper[4687]: I0228 09:30:42.254720 4687 scope.go:117] "RemoveContainer" containerID="7a9df33dd1fc3f826946d659a6953fd6949e61530dfd83b530089fd5e576317d" Feb 28 09:30:42 crc kubenswrapper[4687]: I0228 09:30:42.271005 4687 
scope.go:117] "RemoveContainer" containerID="8d1ee149385bb5d905ef6542e2279421190aa48afdf2729a635bca659f0a9f22" Feb 28 09:30:42 crc kubenswrapper[4687]: I0228 09:30:42.291413 4687 scope.go:117] "RemoveContainer" containerID="1b40e63fdc515073d911fb94bcf36a2b6d45554c85eea334cfe3d4e5db74cfbc" Feb 28 09:30:42 crc kubenswrapper[4687]: I0228 09:30:42.316849 4687 scope.go:117] "RemoveContainer" containerID="f4a255a39b6cee4bfde8c5ec52bb5a6138c52ca1ca5f193e4739b9beb18d718a" Feb 28 09:30:42 crc kubenswrapper[4687]: I0228 09:30:42.340822 4687 scope.go:117] "RemoveContainer" containerID="cfc614ff4012eeea4ee2bb3b218f0501a82c0b2893e9d3698195e010681fb7c8" Feb 28 09:30:42 crc kubenswrapper[4687]: I0228 09:30:42.355142 4687 scope.go:117] "RemoveContainer" containerID="a793e5c63138041812e503b2870f415a857f4e1067bc1332320a92a2b68438a2" Feb 28 09:30:42 crc kubenswrapper[4687]: I0228 09:30:42.368982 4687 scope.go:117] "RemoveContainer" containerID="eafa8aa189b8dc205e3d2d50f710b6dc7d3538bb809e3c71ada4b453013fb30d" Feb 28 09:30:42 crc kubenswrapper[4687]: I0228 09:30:42.388882 4687 scope.go:117] "RemoveContainer" containerID="d9577ae41f570588fc2e1468a12933ca2e321fe2a49078f3670e73d6dc1d2931" Feb 28 09:30:54 crc kubenswrapper[4687]: I0228 09:30:54.657719 4687 scope.go:117] "RemoveContainer" containerID="3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" Feb 28 09:30:54 crc kubenswrapper[4687]: E0228 09:30:54.658271 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:31:01 crc kubenswrapper[4687]: I0228 09:31:01.933411 4687 generic.go:334] "Generic (PLEG): container finished" 
podID="380b1201-b6ba-48e4-b282-fad4f9b945d7" containerID="710deaf1dd2361f05b301ce20edbb32a034243ef82ea56111bcc74c44e33c0dd" exitCode=0 Feb 28 09:31:01 crc kubenswrapper[4687]: I0228 09:31:01.933497 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gdmsq" event={"ID":"380b1201-b6ba-48e4-b282-fad4f9b945d7","Type":"ContainerDied","Data":"710deaf1dd2361f05b301ce20edbb32a034243ef82ea56111bcc74c44e33c0dd"} Feb 28 09:31:03 crc kubenswrapper[4687]: I0228 09:31:03.236452 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gdmsq" Feb 28 09:31:03 crc kubenswrapper[4687]: I0228 09:31:03.242613 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cdh4\" (UniqueName: \"kubernetes.io/projected/380b1201-b6ba-48e4-b282-fad4f9b945d7-kube-api-access-4cdh4\") pod \"380b1201-b6ba-48e4-b282-fad4f9b945d7\" (UID: \"380b1201-b6ba-48e4-b282-fad4f9b945d7\") " Feb 28 09:31:03 crc kubenswrapper[4687]: I0228 09:31:03.242692 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/380b1201-b6ba-48e4-b282-fad4f9b945d7-inventory\") pod \"380b1201-b6ba-48e4-b282-fad4f9b945d7\" (UID: \"380b1201-b6ba-48e4-b282-fad4f9b945d7\") " Feb 28 09:31:03 crc kubenswrapper[4687]: I0228 09:31:03.242777 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/380b1201-b6ba-48e4-b282-fad4f9b945d7-ssh-key-openstack-edpm-ipam\") pod \"380b1201-b6ba-48e4-b282-fad4f9b945d7\" (UID: \"380b1201-b6ba-48e4-b282-fad4f9b945d7\") " Feb 28 09:31:03 crc kubenswrapper[4687]: I0228 09:31:03.247258 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/380b1201-b6ba-48e4-b282-fad4f9b945d7-kube-api-access-4cdh4" 
(OuterVolumeSpecName: "kube-api-access-4cdh4") pod "380b1201-b6ba-48e4-b282-fad4f9b945d7" (UID: "380b1201-b6ba-48e4-b282-fad4f9b945d7"). InnerVolumeSpecName "kube-api-access-4cdh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:31:03 crc kubenswrapper[4687]: I0228 09:31:03.264717 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/380b1201-b6ba-48e4-b282-fad4f9b945d7-inventory" (OuterVolumeSpecName: "inventory") pod "380b1201-b6ba-48e4-b282-fad4f9b945d7" (UID: "380b1201-b6ba-48e4-b282-fad4f9b945d7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:31:03 crc kubenswrapper[4687]: I0228 09:31:03.265115 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/380b1201-b6ba-48e4-b282-fad4f9b945d7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "380b1201-b6ba-48e4-b282-fad4f9b945d7" (UID: "380b1201-b6ba-48e4-b282-fad4f9b945d7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:31:03 crc kubenswrapper[4687]: I0228 09:31:03.345550 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cdh4\" (UniqueName: \"kubernetes.io/projected/380b1201-b6ba-48e4-b282-fad4f9b945d7-kube-api-access-4cdh4\") on node \"crc\" DevicePath \"\"" Feb 28 09:31:03 crc kubenswrapper[4687]: I0228 09:31:03.345578 4687 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/380b1201-b6ba-48e4-b282-fad4f9b945d7-inventory\") on node \"crc\" DevicePath \"\"" Feb 28 09:31:03 crc kubenswrapper[4687]: I0228 09:31:03.345590 4687 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/380b1201-b6ba-48e4-b282-fad4f9b945d7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 28 09:31:03 crc kubenswrapper[4687]: I0228 09:31:03.947713 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gdmsq" event={"ID":"380b1201-b6ba-48e4-b282-fad4f9b945d7","Type":"ContainerDied","Data":"1c8a65d8180b1886f67a3d05c0251cc33388130577c23356c2c4a74b4e3cacd5"} Feb 28 09:31:03 crc kubenswrapper[4687]: I0228 09:31:03.947903 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c8a65d8180b1886f67a3d05c0251cc33388130577c23356c2c4a74b4e3cacd5" Feb 28 09:31:03 crc kubenswrapper[4687]: I0228 09:31:03.947776 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gdmsq" Feb 28 09:31:04 crc kubenswrapper[4687]: I0228 09:31:04.008999 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5"] Feb 28 09:31:04 crc kubenswrapper[4687]: E0228 09:31:04.009444 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="380b1201-b6ba-48e4-b282-fad4f9b945d7" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 28 09:31:04 crc kubenswrapper[4687]: I0228 09:31:04.009461 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="380b1201-b6ba-48e4-b282-fad4f9b945d7" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 28 09:31:04 crc kubenswrapper[4687]: I0228 09:31:04.009612 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="380b1201-b6ba-48e4-b282-fad4f9b945d7" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 28 09:31:04 crc kubenswrapper[4687]: I0228 09:31:04.010292 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5" Feb 28 09:31:04 crc kubenswrapper[4687]: I0228 09:31:04.014643 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 28 09:31:04 crc kubenswrapper[4687]: I0228 09:31:04.014797 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 28 09:31:04 crc kubenswrapper[4687]: I0228 09:31:04.015465 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 28 09:31:04 crc kubenswrapper[4687]: I0228 09:31:04.017171 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ffgb4" Feb 28 09:31:04 crc kubenswrapper[4687]: I0228 09:31:04.020689 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5"] Feb 28 09:31:04 crc kubenswrapper[4687]: I0228 09:31:04.157175 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmjh7\" (UniqueName: \"kubernetes.io/projected/8a947fbc-4fb5-4be7-819c-703c45480b29-kube-api-access-kmjh7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5\" (UID: \"8a947fbc-4fb5-4be7-819c-703c45480b29\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5" Feb 28 09:31:04 crc kubenswrapper[4687]: I0228 09:31:04.157224 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a947fbc-4fb5-4be7-819c-703c45480b29-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5\" (UID: \"8a947fbc-4fb5-4be7-819c-703c45480b29\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5" Feb 28 09:31:04 crc kubenswrapper[4687]: I0228 09:31:04.157283 4687 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a947fbc-4fb5-4be7-819c-703c45480b29-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5\" (UID: \"8a947fbc-4fb5-4be7-819c-703c45480b29\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5" Feb 28 09:31:04 crc kubenswrapper[4687]: I0228 09:31:04.258555 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmjh7\" (UniqueName: \"kubernetes.io/projected/8a947fbc-4fb5-4be7-819c-703c45480b29-kube-api-access-kmjh7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5\" (UID: \"8a947fbc-4fb5-4be7-819c-703c45480b29\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5" Feb 28 09:31:04 crc kubenswrapper[4687]: I0228 09:31:04.259136 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a947fbc-4fb5-4be7-819c-703c45480b29-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5\" (UID: \"8a947fbc-4fb5-4be7-819c-703c45480b29\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5" Feb 28 09:31:04 crc kubenswrapper[4687]: I0228 09:31:04.259295 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a947fbc-4fb5-4be7-819c-703c45480b29-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5\" (UID: \"8a947fbc-4fb5-4be7-819c-703c45480b29\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5" Feb 28 09:31:04 crc kubenswrapper[4687]: I0228 09:31:04.262442 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a947fbc-4fb5-4be7-819c-703c45480b29-inventory\") 
pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5\" (UID: \"8a947fbc-4fb5-4be7-819c-703c45480b29\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5" Feb 28 09:31:04 crc kubenswrapper[4687]: I0228 09:31:04.262581 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a947fbc-4fb5-4be7-819c-703c45480b29-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5\" (UID: \"8a947fbc-4fb5-4be7-819c-703c45480b29\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5" Feb 28 09:31:04 crc kubenswrapper[4687]: I0228 09:31:04.273624 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmjh7\" (UniqueName: \"kubernetes.io/projected/8a947fbc-4fb5-4be7-819c-703c45480b29-kube-api-access-kmjh7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5\" (UID: \"8a947fbc-4fb5-4be7-819c-703c45480b29\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5" Feb 28 09:31:04 crc kubenswrapper[4687]: I0228 09:31:04.335502 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5" Feb 28 09:31:04 crc kubenswrapper[4687]: I0228 09:31:04.752086 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5"] Feb 28 09:31:04 crc kubenswrapper[4687]: I0228 09:31:04.954908 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5" event={"ID":"8a947fbc-4fb5-4be7-819c-703c45480b29","Type":"ContainerStarted","Data":"ae2c1f9f626de8f1bd9aee21856bee0648aa1d6482a167e6c42f9b83f6f58c0e"} Feb 28 09:31:05 crc kubenswrapper[4687]: I0228 09:31:05.962130 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5" event={"ID":"8a947fbc-4fb5-4be7-819c-703c45480b29","Type":"ContainerStarted","Data":"e4ec879b9d891dde3ff1ef416309afd689cffc6a615b4792a524e2c34c7e5582"} Feb 28 09:31:05 crc kubenswrapper[4687]: I0228 09:31:05.975588 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5" podStartSLOduration=2.465210962 podStartE2EDuration="2.975578388s" podCreationTimestamp="2026-02-28 09:31:03 +0000 UTC" firstStartedPulling="2026-02-28 09:31:04.752043604 +0000 UTC m=+1656.442612941" lastFinishedPulling="2026-02-28 09:31:05.26241103 +0000 UTC m=+1656.952980367" observedRunningTime="2026-02-28 09:31:05.972172834 +0000 UTC m=+1657.662742171" watchObservedRunningTime="2026-02-28 09:31:05.975578388 +0000 UTC m=+1657.666147725" Feb 28 09:31:08 crc kubenswrapper[4687]: I0228 09:31:08.661358 4687 scope.go:117] "RemoveContainer" containerID="3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" Feb 28 09:31:08 crc kubenswrapper[4687]: E0228 09:31:08.662051 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:31:21 crc kubenswrapper[4687]: I0228 09:31:21.029458 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-a257-account-create-update-9t7cz"] Feb 28 09:31:21 crc kubenswrapper[4687]: I0228 09:31:21.036137 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-a257-account-create-update-9t7cz"] Feb 28 09:31:22 crc kubenswrapper[4687]: I0228 09:31:22.029208 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-68b5-account-create-update-zv5jr"] Feb 28 09:31:22 crc kubenswrapper[4687]: I0228 09:31:22.034810 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-d297-account-create-update-688sh"] Feb 28 09:31:22 crc kubenswrapper[4687]: I0228 09:31:22.042438 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-fzwm9"] Feb 28 09:31:22 crc kubenswrapper[4687]: I0228 09:31:22.048232 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-86rbf"] Feb 28 09:31:22 crc kubenswrapper[4687]: I0228 09:31:22.053919 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-d297-account-create-update-688sh"] Feb 28 09:31:22 crc kubenswrapper[4687]: I0228 09:31:22.059329 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-6qwj6"] Feb 28 09:31:22 crc kubenswrapper[4687]: I0228 09:31:22.064722 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-fzwm9"] Feb 28 09:31:22 crc kubenswrapper[4687]: I0228 09:31:22.069433 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell0-db-create-86rbf"] Feb 28 09:31:22 crc kubenswrapper[4687]: I0228 09:31:22.076634 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-6qwj6"] Feb 28 09:31:22 crc kubenswrapper[4687]: I0228 09:31:22.081700 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-68b5-account-create-update-zv5jr"] Feb 28 09:31:22 crc kubenswrapper[4687]: I0228 09:31:22.669624 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="131e7bdc-bd19-4a7e-b0ad-a561c7f3a857" path="/var/lib/kubelet/pods/131e7bdc-bd19-4a7e-b0ad-a561c7f3a857/volumes" Feb 28 09:31:22 crc kubenswrapper[4687]: I0228 09:31:22.670346 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5777779d-582f-4e60-ac7f-e194408c31eb" path="/var/lib/kubelet/pods/5777779d-582f-4e60-ac7f-e194408c31eb/volumes" Feb 28 09:31:22 crc kubenswrapper[4687]: I0228 09:31:22.670993 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8523b7a8-45d6-4708-b1e7-4c3dbb505640" path="/var/lib/kubelet/pods/8523b7a8-45d6-4708-b1e7-4c3dbb505640/volumes" Feb 28 09:31:22 crc kubenswrapper[4687]: I0228 09:31:22.671639 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9159f256-61e8-41bc-bceb-d602b568ef60" path="/var/lib/kubelet/pods/9159f256-61e8-41bc-bceb-d602b568ef60/volumes" Feb 28 09:31:22 crc kubenswrapper[4687]: I0228 09:31:22.673187 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc1143c7-db81-4638-ad50-a1d7d26d9ad7" path="/var/lib/kubelet/pods/dc1143c7-db81-4638-ad50-a1d7d26d9ad7/volumes" Feb 28 09:31:22 crc kubenswrapper[4687]: I0228 09:31:22.674464 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcaf528f-cd30-4024-b73f-da1ac741ee53" path="/var/lib/kubelet/pods/fcaf528f-cd30-4024-b73f-da1ac741ee53/volumes" Feb 28 09:31:23 crc kubenswrapper[4687]: I0228 09:31:23.657177 4687 scope.go:117] "RemoveContainer" 
containerID="3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" Feb 28 09:31:23 crc kubenswrapper[4687]: E0228 09:31:23.657861 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:31:34 crc kubenswrapper[4687]: I0228 09:31:34.657301 4687 scope.go:117] "RemoveContainer" containerID="3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" Feb 28 09:31:34 crc kubenswrapper[4687]: E0228 09:31:34.659135 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:31:39 crc kubenswrapper[4687]: I0228 09:31:39.210653 4687 generic.go:334] "Generic (PLEG): container finished" podID="8a947fbc-4fb5-4be7-819c-703c45480b29" containerID="e4ec879b9d891dde3ff1ef416309afd689cffc6a615b4792a524e2c34c7e5582" exitCode=0 Feb 28 09:31:39 crc kubenswrapper[4687]: I0228 09:31:39.210811 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5" event={"ID":"8a947fbc-4fb5-4be7-819c-703c45480b29","Type":"ContainerDied","Data":"e4ec879b9d891dde3ff1ef416309afd689cffc6a615b4792a524e2c34c7e5582"} Feb 28 09:31:40 crc kubenswrapper[4687]: I0228 09:31:40.545352 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5" Feb 28 09:31:40 crc kubenswrapper[4687]: I0228 09:31:40.733522 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a947fbc-4fb5-4be7-819c-703c45480b29-ssh-key-openstack-edpm-ipam\") pod \"8a947fbc-4fb5-4be7-819c-703c45480b29\" (UID: \"8a947fbc-4fb5-4be7-819c-703c45480b29\") " Feb 28 09:31:40 crc kubenswrapper[4687]: I0228 09:31:40.733574 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a947fbc-4fb5-4be7-819c-703c45480b29-inventory\") pod \"8a947fbc-4fb5-4be7-819c-703c45480b29\" (UID: \"8a947fbc-4fb5-4be7-819c-703c45480b29\") " Feb 28 09:31:40 crc kubenswrapper[4687]: I0228 09:31:40.733638 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmjh7\" (UniqueName: \"kubernetes.io/projected/8a947fbc-4fb5-4be7-819c-703c45480b29-kube-api-access-kmjh7\") pod \"8a947fbc-4fb5-4be7-819c-703c45480b29\" (UID: \"8a947fbc-4fb5-4be7-819c-703c45480b29\") " Feb 28 09:31:40 crc kubenswrapper[4687]: I0228 09:31:40.750407 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a947fbc-4fb5-4be7-819c-703c45480b29-kube-api-access-kmjh7" (OuterVolumeSpecName: "kube-api-access-kmjh7") pod "8a947fbc-4fb5-4be7-819c-703c45480b29" (UID: "8a947fbc-4fb5-4be7-819c-703c45480b29"). InnerVolumeSpecName "kube-api-access-kmjh7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:31:40 crc kubenswrapper[4687]: I0228 09:31:40.754305 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a947fbc-4fb5-4be7-819c-703c45480b29-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8a947fbc-4fb5-4be7-819c-703c45480b29" (UID: "8a947fbc-4fb5-4be7-819c-703c45480b29"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:31:40 crc kubenswrapper[4687]: I0228 09:31:40.754664 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a947fbc-4fb5-4be7-819c-703c45480b29-inventory" (OuterVolumeSpecName: "inventory") pod "8a947fbc-4fb5-4be7-819c-703c45480b29" (UID: "8a947fbc-4fb5-4be7-819c-703c45480b29"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:31:40 crc kubenswrapper[4687]: I0228 09:31:40.837301 4687 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a947fbc-4fb5-4be7-819c-703c45480b29-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 28 09:31:40 crc kubenswrapper[4687]: I0228 09:31:40.837934 4687 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a947fbc-4fb5-4be7-819c-703c45480b29-inventory\") on node \"crc\" DevicePath \"\"" Feb 28 09:31:40 crc kubenswrapper[4687]: I0228 09:31:40.837972 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmjh7\" (UniqueName: \"kubernetes.io/projected/8a947fbc-4fb5-4be7-819c-703c45480b29-kube-api-access-kmjh7\") on node \"crc\" DevicePath \"\"" Feb 28 09:31:41 crc kubenswrapper[4687]: I0228 09:31:41.035216 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dm25t"] Feb 28 09:31:41 crc kubenswrapper[4687]: I0228 
09:31:41.040029 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dm25t"] Feb 28 09:31:41 crc kubenswrapper[4687]: I0228 09:31:41.238213 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5" event={"ID":"8a947fbc-4fb5-4be7-819c-703c45480b29","Type":"ContainerDied","Data":"ae2c1f9f626de8f1bd9aee21856bee0648aa1d6482a167e6c42f9b83f6f58c0e"} Feb 28 09:31:41 crc kubenswrapper[4687]: I0228 09:31:41.238284 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5" Feb 28 09:31:41 crc kubenswrapper[4687]: I0228 09:31:41.238291 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae2c1f9f626de8f1bd9aee21856bee0648aa1d6482a167e6c42f9b83f6f58c0e" Feb 28 09:31:41 crc kubenswrapper[4687]: I0228 09:31:41.293240 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-989zc"] Feb 28 09:31:41 crc kubenswrapper[4687]: E0228 09:31:41.293725 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a947fbc-4fb5-4be7-819c-703c45480b29" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 28 09:31:41 crc kubenswrapper[4687]: I0228 09:31:41.293743 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a947fbc-4fb5-4be7-819c-703c45480b29" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 28 09:31:41 crc kubenswrapper[4687]: I0228 09:31:41.293924 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a947fbc-4fb5-4be7-819c-703c45480b29" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 28 09:31:41 crc kubenswrapper[4687]: I0228 09:31:41.294629 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-989zc" Feb 28 09:31:41 crc kubenswrapper[4687]: I0228 09:31:41.296460 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 28 09:31:41 crc kubenswrapper[4687]: I0228 09:31:41.296530 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 28 09:31:41 crc kubenswrapper[4687]: I0228 09:31:41.296536 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ffgb4" Feb 28 09:31:41 crc kubenswrapper[4687]: I0228 09:31:41.297604 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 28 09:31:41 crc kubenswrapper[4687]: I0228 09:31:41.301050 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-989zc"] Feb 28 09:31:41 crc kubenswrapper[4687]: I0228 09:31:41.448980 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x6dl\" (UniqueName: \"kubernetes.io/projected/a605b600-b94d-4f23-9922-f9d8478cf6ef-kube-api-access-2x6dl\") pod \"ssh-known-hosts-edpm-deployment-989zc\" (UID: \"a605b600-b94d-4f23-9922-f9d8478cf6ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-989zc" Feb 28 09:31:41 crc kubenswrapper[4687]: I0228 09:31:41.449317 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a605b600-b94d-4f23-9922-f9d8478cf6ef-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-989zc\" (UID: \"a605b600-b94d-4f23-9922-f9d8478cf6ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-989zc" Feb 28 09:31:41 crc kubenswrapper[4687]: I0228 09:31:41.449519 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a605b600-b94d-4f23-9922-f9d8478cf6ef-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-989zc\" (UID: \"a605b600-b94d-4f23-9922-f9d8478cf6ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-989zc" Feb 28 09:31:41 crc kubenswrapper[4687]: I0228 09:31:41.552549 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a605b600-b94d-4f23-9922-f9d8478cf6ef-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-989zc\" (UID: \"a605b600-b94d-4f23-9922-f9d8478cf6ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-989zc" Feb 28 09:31:41 crc kubenswrapper[4687]: I0228 09:31:41.552635 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a605b600-b94d-4f23-9922-f9d8478cf6ef-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-989zc\" (UID: \"a605b600-b94d-4f23-9922-f9d8478cf6ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-989zc" Feb 28 09:31:41 crc kubenswrapper[4687]: I0228 09:31:41.552755 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x6dl\" (UniqueName: \"kubernetes.io/projected/a605b600-b94d-4f23-9922-f9d8478cf6ef-kube-api-access-2x6dl\") pod \"ssh-known-hosts-edpm-deployment-989zc\" (UID: \"a605b600-b94d-4f23-9922-f9d8478cf6ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-989zc" Feb 28 09:31:41 crc kubenswrapper[4687]: I0228 09:31:41.558381 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a605b600-b94d-4f23-9922-f9d8478cf6ef-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-989zc\" (UID: \"a605b600-b94d-4f23-9922-f9d8478cf6ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-989zc" Feb 28 09:31:41 crc kubenswrapper[4687]: 
I0228 09:31:41.563601 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a605b600-b94d-4f23-9922-f9d8478cf6ef-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-989zc\" (UID: \"a605b600-b94d-4f23-9922-f9d8478cf6ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-989zc" Feb 28 09:31:41 crc kubenswrapper[4687]: I0228 09:31:41.576874 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x6dl\" (UniqueName: \"kubernetes.io/projected/a605b600-b94d-4f23-9922-f9d8478cf6ef-kube-api-access-2x6dl\") pod \"ssh-known-hosts-edpm-deployment-989zc\" (UID: \"a605b600-b94d-4f23-9922-f9d8478cf6ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-989zc" Feb 28 09:31:41 crc kubenswrapper[4687]: I0228 09:31:41.606965 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-989zc" Feb 28 09:31:42 crc kubenswrapper[4687]: I0228 09:31:42.057011 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-989zc"] Feb 28 09:31:42 crc kubenswrapper[4687]: I0228 09:31:42.248278 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-989zc" event={"ID":"a605b600-b94d-4f23-9922-f9d8478cf6ef","Type":"ContainerStarted","Data":"c90494c64db22abbe0eb7c7dae208735cc745d58ac3d4dc323513a38f774464e"} Feb 28 09:31:42 crc kubenswrapper[4687]: I0228 09:31:42.553650 4687 scope.go:117] "RemoveContainer" containerID="c24080e76861b55cba45b319deac146d73a4f68b51069d4b4ce6a2e35a9bc587" Feb 28 09:31:42 crc kubenswrapper[4687]: I0228 09:31:42.637864 4687 scope.go:117] "RemoveContainer" containerID="15b1e63e4e21a967e9268dfa26574c0b6305a6b5de1a1d1cc14160fdab783c24" Feb 28 09:31:42 crc kubenswrapper[4687]: I0228 09:31:42.662154 4687 scope.go:117] "RemoveContainer" containerID="0578526f9046619d023056e30230dbbd0cb87d8a6b2896535f311cb8c67a00c8" Feb 28 
09:31:42 crc kubenswrapper[4687]: I0228 09:31:42.670589 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c40a499-8f9a-4d0e-b266-4a5defbb7e22" path="/var/lib/kubelet/pods/1c40a499-8f9a-4d0e-b266-4a5defbb7e22/volumes" Feb 28 09:31:42 crc kubenswrapper[4687]: I0228 09:31:42.737996 4687 scope.go:117] "RemoveContainer" containerID="9b5fd43c0e428bad6cdca44f3ec6b58781eea24cec60383f60b95a07397f8736" Feb 28 09:31:42 crc kubenswrapper[4687]: I0228 09:31:42.764388 4687 scope.go:117] "RemoveContainer" containerID="b64dea1d776a999a2a152ba3b9c5b54f53b61d67f6df9c676d2569c3c14be455" Feb 28 09:31:42 crc kubenswrapper[4687]: I0228 09:31:42.787964 4687 scope.go:117] "RemoveContainer" containerID="879a5a4f6bce04b8d0f622b16c280ec2579a68147499ef39475b3805b16f0c0d" Feb 28 09:31:42 crc kubenswrapper[4687]: I0228 09:31:42.806731 4687 scope.go:117] "RemoveContainer" containerID="3a89bc6486a14484de7179d420bf770c0d6e6f262f92b3f6fe6bfaee21fd64a8" Feb 28 09:31:43 crc kubenswrapper[4687]: I0228 09:31:43.256357 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-989zc" event={"ID":"a605b600-b94d-4f23-9922-f9d8478cf6ef","Type":"ContainerStarted","Data":"754687cf45d6f194c9cd76e3c25d1a088e9302b1849529856d82652f2e3ce149"} Feb 28 09:31:43 crc kubenswrapper[4687]: I0228 09:31:43.277973 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-989zc" podStartSLOduration=1.809971354 podStartE2EDuration="2.277956282s" podCreationTimestamp="2026-02-28 09:31:41 +0000 UTC" firstStartedPulling="2026-02-28 09:31:42.058324885 +0000 UTC m=+1693.748894222" lastFinishedPulling="2026-02-28 09:31:42.526309814 +0000 UTC m=+1694.216879150" observedRunningTime="2026-02-28 09:31:43.267138855 +0000 UTC m=+1694.957708191" watchObservedRunningTime="2026-02-28 09:31:43.277956282 +0000 UTC m=+1694.968525619" Feb 28 09:31:47 crc kubenswrapper[4687]: I0228 09:31:47.657936 4687 scope.go:117] 
"RemoveContainer" containerID="3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" Feb 28 09:31:47 crc kubenswrapper[4687]: E0228 09:31:47.658542 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:31:48 crc kubenswrapper[4687]: I0228 09:31:48.309912 4687 generic.go:334] "Generic (PLEG): container finished" podID="a605b600-b94d-4f23-9922-f9d8478cf6ef" containerID="754687cf45d6f194c9cd76e3c25d1a088e9302b1849529856d82652f2e3ce149" exitCode=0 Feb 28 09:31:48 crc kubenswrapper[4687]: I0228 09:31:48.309968 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-989zc" event={"ID":"a605b600-b94d-4f23-9922-f9d8478cf6ef","Type":"ContainerDied","Data":"754687cf45d6f194c9cd76e3c25d1a088e9302b1849529856d82652f2e3ce149"} Feb 28 09:31:49 crc kubenswrapper[4687]: I0228 09:31:49.666858 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-989zc" Feb 28 09:31:49 crc kubenswrapper[4687]: I0228 09:31:49.822742 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x6dl\" (UniqueName: \"kubernetes.io/projected/a605b600-b94d-4f23-9922-f9d8478cf6ef-kube-api-access-2x6dl\") pod \"a605b600-b94d-4f23-9922-f9d8478cf6ef\" (UID: \"a605b600-b94d-4f23-9922-f9d8478cf6ef\") " Feb 28 09:31:49 crc kubenswrapper[4687]: I0228 09:31:49.822921 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a605b600-b94d-4f23-9922-f9d8478cf6ef-ssh-key-openstack-edpm-ipam\") pod \"a605b600-b94d-4f23-9922-f9d8478cf6ef\" (UID: \"a605b600-b94d-4f23-9922-f9d8478cf6ef\") " Feb 28 09:31:49 crc kubenswrapper[4687]: I0228 09:31:49.823049 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a605b600-b94d-4f23-9922-f9d8478cf6ef-inventory-0\") pod \"a605b600-b94d-4f23-9922-f9d8478cf6ef\" (UID: \"a605b600-b94d-4f23-9922-f9d8478cf6ef\") " Feb 28 09:31:49 crc kubenswrapper[4687]: I0228 09:31:49.829263 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a605b600-b94d-4f23-9922-f9d8478cf6ef-kube-api-access-2x6dl" (OuterVolumeSpecName: "kube-api-access-2x6dl") pod "a605b600-b94d-4f23-9922-f9d8478cf6ef" (UID: "a605b600-b94d-4f23-9922-f9d8478cf6ef"). InnerVolumeSpecName "kube-api-access-2x6dl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:31:49 crc kubenswrapper[4687]: I0228 09:31:49.845686 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a605b600-b94d-4f23-9922-f9d8478cf6ef-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "a605b600-b94d-4f23-9922-f9d8478cf6ef" (UID: "a605b600-b94d-4f23-9922-f9d8478cf6ef"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:31:49 crc kubenswrapper[4687]: I0228 09:31:49.846449 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a605b600-b94d-4f23-9922-f9d8478cf6ef-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a605b600-b94d-4f23-9922-f9d8478cf6ef" (UID: "a605b600-b94d-4f23-9922-f9d8478cf6ef"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:31:49 crc kubenswrapper[4687]: I0228 09:31:49.926014 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x6dl\" (UniqueName: \"kubernetes.io/projected/a605b600-b94d-4f23-9922-f9d8478cf6ef-kube-api-access-2x6dl\") on node \"crc\" DevicePath \"\"" Feb 28 09:31:49 crc kubenswrapper[4687]: I0228 09:31:49.926070 4687 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a605b600-b94d-4f23-9922-f9d8478cf6ef-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 28 09:31:49 crc kubenswrapper[4687]: I0228 09:31:49.926085 4687 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a605b600-b94d-4f23-9922-f9d8478cf6ef-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 28 09:31:50 crc kubenswrapper[4687]: I0228 09:31:50.334553 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-989zc" event={"ID":"a605b600-b94d-4f23-9922-f9d8478cf6ef","Type":"ContainerDied","Data":"c90494c64db22abbe0eb7c7dae208735cc745d58ac3d4dc323513a38f774464e"} Feb 28 09:31:50 crc kubenswrapper[4687]: I0228 09:31:50.334597 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c90494c64db22abbe0eb7c7dae208735cc745d58ac3d4dc323513a38f774464e" Feb 28 09:31:50 crc kubenswrapper[4687]: I0228 09:31:50.334607 
4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-989zc" Feb 28 09:31:50 crc kubenswrapper[4687]: I0228 09:31:50.394454 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bn6fw"] Feb 28 09:31:50 crc kubenswrapper[4687]: E0228 09:31:50.395240 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a605b600-b94d-4f23-9922-f9d8478cf6ef" containerName="ssh-known-hosts-edpm-deployment" Feb 28 09:31:50 crc kubenswrapper[4687]: I0228 09:31:50.395262 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="a605b600-b94d-4f23-9922-f9d8478cf6ef" containerName="ssh-known-hosts-edpm-deployment" Feb 28 09:31:50 crc kubenswrapper[4687]: I0228 09:31:50.395507 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="a605b600-b94d-4f23-9922-f9d8478cf6ef" containerName="ssh-known-hosts-edpm-deployment" Feb 28 09:31:50 crc kubenswrapper[4687]: I0228 09:31:50.396192 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bn6fw" Feb 28 09:31:50 crc kubenswrapper[4687]: I0228 09:31:50.402216 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ffgb4" Feb 28 09:31:50 crc kubenswrapper[4687]: I0228 09:31:50.403040 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bn6fw"] Feb 28 09:31:50 crc kubenswrapper[4687]: I0228 09:31:50.403253 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 28 09:31:50 crc kubenswrapper[4687]: I0228 09:31:50.405539 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 28 09:31:50 crc kubenswrapper[4687]: I0228 09:31:50.407401 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 28 09:31:50 crc kubenswrapper[4687]: I0228 09:31:50.437266 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4trz\" (UniqueName: \"kubernetes.io/projected/47d00581-22fa-4c52-a057-6d757f969f52-kube-api-access-j4trz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bn6fw\" (UID: \"47d00581-22fa-4c52-a057-6d757f969f52\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bn6fw" Feb 28 09:31:50 crc kubenswrapper[4687]: I0228 09:31:50.437310 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47d00581-22fa-4c52-a057-6d757f969f52-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bn6fw\" (UID: \"47d00581-22fa-4c52-a057-6d757f969f52\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bn6fw" Feb 28 09:31:50 crc kubenswrapper[4687]: I0228 09:31:50.437543 4687 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/47d00581-22fa-4c52-a057-6d757f969f52-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bn6fw\" (UID: \"47d00581-22fa-4c52-a057-6d757f969f52\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bn6fw" Feb 28 09:31:50 crc kubenswrapper[4687]: I0228 09:31:50.538785 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/47d00581-22fa-4c52-a057-6d757f969f52-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bn6fw\" (UID: \"47d00581-22fa-4c52-a057-6d757f969f52\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bn6fw" Feb 28 09:31:50 crc kubenswrapper[4687]: I0228 09:31:50.538890 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4trz\" (UniqueName: \"kubernetes.io/projected/47d00581-22fa-4c52-a057-6d757f969f52-kube-api-access-j4trz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bn6fw\" (UID: \"47d00581-22fa-4c52-a057-6d757f969f52\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bn6fw" Feb 28 09:31:50 crc kubenswrapper[4687]: I0228 09:31:50.538928 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47d00581-22fa-4c52-a057-6d757f969f52-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bn6fw\" (UID: \"47d00581-22fa-4c52-a057-6d757f969f52\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bn6fw" Feb 28 09:31:50 crc kubenswrapper[4687]: I0228 09:31:50.543195 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/47d00581-22fa-4c52-a057-6d757f969f52-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-bn6fw\" (UID: \"47d00581-22fa-4c52-a057-6d757f969f52\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bn6fw" Feb 28 09:31:50 crc kubenswrapper[4687]: I0228 09:31:50.543930 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47d00581-22fa-4c52-a057-6d757f969f52-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bn6fw\" (UID: \"47d00581-22fa-4c52-a057-6d757f969f52\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bn6fw" Feb 28 09:31:50 crc kubenswrapper[4687]: I0228 09:31:50.555918 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4trz\" (UniqueName: \"kubernetes.io/projected/47d00581-22fa-4c52-a057-6d757f969f52-kube-api-access-j4trz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bn6fw\" (UID: \"47d00581-22fa-4c52-a057-6d757f969f52\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bn6fw" Feb 28 09:31:50 crc kubenswrapper[4687]: I0228 09:31:50.709878 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bn6fw" Feb 28 09:31:51 crc kubenswrapper[4687]: I0228 09:31:51.157915 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bn6fw"] Feb 28 09:31:51 crc kubenswrapper[4687]: I0228 09:31:51.353340 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bn6fw" event={"ID":"47d00581-22fa-4c52-a057-6d757f969f52","Type":"ContainerStarted","Data":"d63f813a6f47b9fd0410d9fe47b18e04e41ded174513bc0ba44da2af70f24250"} Feb 28 09:31:52 crc kubenswrapper[4687]: I0228 09:31:52.364261 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bn6fw" event={"ID":"47d00581-22fa-4c52-a057-6d757f969f52","Type":"ContainerStarted","Data":"e38d62b762e1dfcb62ab053914fc4a571682c26505fa26a5364bc7a5766e00d3"} Feb 28 09:31:52 crc kubenswrapper[4687]: I0228 09:31:52.385388 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bn6fw" podStartSLOduration=1.938019489 podStartE2EDuration="2.385372559s" podCreationTimestamp="2026-02-28 09:31:50 +0000 UTC" firstStartedPulling="2026-02-28 09:31:51.161975744 +0000 UTC m=+1702.852545081" lastFinishedPulling="2026-02-28 09:31:51.609328814 +0000 UTC m=+1703.299898151" observedRunningTime="2026-02-28 09:31:52.378566364 +0000 UTC m=+1704.069135701" watchObservedRunningTime="2026-02-28 09:31:52.385372559 +0000 UTC m=+1704.075941886" Feb 28 09:31:58 crc kubenswrapper[4687]: I0228 09:31:58.421502 4687 generic.go:334] "Generic (PLEG): container finished" podID="47d00581-22fa-4c52-a057-6d757f969f52" containerID="e38d62b762e1dfcb62ab053914fc4a571682c26505fa26a5364bc7a5766e00d3" exitCode=0 Feb 28 09:31:58 crc kubenswrapper[4687]: I0228 09:31:58.421539 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bn6fw" event={"ID":"47d00581-22fa-4c52-a057-6d757f969f52","Type":"ContainerDied","Data":"e38d62b762e1dfcb62ab053914fc4a571682c26505fa26a5364bc7a5766e00d3"} Feb 28 09:31:59 crc kubenswrapper[4687]: I0228 09:31:59.778812 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bn6fw" Feb 28 09:31:59 crc kubenswrapper[4687]: I0228 09:31:59.939894 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4trz\" (UniqueName: \"kubernetes.io/projected/47d00581-22fa-4c52-a057-6d757f969f52-kube-api-access-j4trz\") pod \"47d00581-22fa-4c52-a057-6d757f969f52\" (UID: \"47d00581-22fa-4c52-a057-6d757f969f52\") " Feb 28 09:31:59 crc kubenswrapper[4687]: I0228 09:31:59.940177 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/47d00581-22fa-4c52-a057-6d757f969f52-ssh-key-openstack-edpm-ipam\") pod \"47d00581-22fa-4c52-a057-6d757f969f52\" (UID: \"47d00581-22fa-4c52-a057-6d757f969f52\") " Feb 28 09:31:59 crc kubenswrapper[4687]: I0228 09:31:59.940254 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47d00581-22fa-4c52-a057-6d757f969f52-inventory\") pod \"47d00581-22fa-4c52-a057-6d757f969f52\" (UID: \"47d00581-22fa-4c52-a057-6d757f969f52\") " Feb 28 09:31:59 crc kubenswrapper[4687]: I0228 09:31:59.946379 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47d00581-22fa-4c52-a057-6d757f969f52-kube-api-access-j4trz" (OuterVolumeSpecName: "kube-api-access-j4trz") pod "47d00581-22fa-4c52-a057-6d757f969f52" (UID: "47d00581-22fa-4c52-a057-6d757f969f52"). InnerVolumeSpecName "kube-api-access-j4trz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:31:59 crc kubenswrapper[4687]: I0228 09:31:59.977552 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47d00581-22fa-4c52-a057-6d757f969f52-inventory" (OuterVolumeSpecName: "inventory") pod "47d00581-22fa-4c52-a057-6d757f969f52" (UID: "47d00581-22fa-4c52-a057-6d757f969f52"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:31:59 crc kubenswrapper[4687]: I0228 09:31:59.979889 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47d00581-22fa-4c52-a057-6d757f969f52-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "47d00581-22fa-4c52-a057-6d757f969f52" (UID: "47d00581-22fa-4c52-a057-6d757f969f52"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.028484 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kz8x8"] Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.037332 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kz8x8"] Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.043366 4687 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/47d00581-22fa-4c52-a057-6d757f969f52-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.043403 4687 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47d00581-22fa-4c52-a057-6d757f969f52-inventory\") on node \"crc\" DevicePath \"\"" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.043415 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4trz\" (UniqueName: 
\"kubernetes.io/projected/47d00581-22fa-4c52-a057-6d757f969f52-kube-api-access-j4trz\") on node \"crc\" DevicePath \"\"" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.140984 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537852-2zdp6"] Feb 28 09:32:00 crc kubenswrapper[4687]: E0228 09:32:00.141474 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d00581-22fa-4c52-a057-6d757f969f52" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.141494 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d00581-22fa-4c52-a057-6d757f969f52" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.141681 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="47d00581-22fa-4c52-a057-6d757f969f52" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.142365 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537852-2zdp6" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.145138 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.146269 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.146367 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.148927 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537852-2zdp6"] Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.248058 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7xz7\" (UniqueName: \"kubernetes.io/projected/958ee491-1300-475e-9410-521ebb3f5078-kube-api-access-d7xz7\") pod \"auto-csr-approver-29537852-2zdp6\" (UID: \"958ee491-1300-475e-9410-521ebb3f5078\") " pod="openshift-infra/auto-csr-approver-29537852-2zdp6" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.349568 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7xz7\" (UniqueName: \"kubernetes.io/projected/958ee491-1300-475e-9410-521ebb3f5078-kube-api-access-d7xz7\") pod \"auto-csr-approver-29537852-2zdp6\" (UID: \"958ee491-1300-475e-9410-521ebb3f5078\") " pod="openshift-infra/auto-csr-approver-29537852-2zdp6" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.365082 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7xz7\" (UniqueName: \"kubernetes.io/projected/958ee491-1300-475e-9410-521ebb3f5078-kube-api-access-d7xz7\") pod \"auto-csr-approver-29537852-2zdp6\" (UID: \"958ee491-1300-475e-9410-521ebb3f5078\") " 
pod="openshift-infra/auto-csr-approver-29537852-2zdp6" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.441971 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bn6fw" event={"ID":"47d00581-22fa-4c52-a057-6d757f969f52","Type":"ContainerDied","Data":"d63f813a6f47b9fd0410d9fe47b18e04e41ded174513bc0ba44da2af70f24250"} Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.442042 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d63f813a6f47b9fd0410d9fe47b18e04e41ded174513bc0ba44da2af70f24250" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.442056 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bn6fw" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.455654 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537852-2zdp6" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.512234 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd"] Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.513683 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.515643 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.515846 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ffgb4" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.515894 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.515972 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.517789 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd"] Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.662730 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9wwm\" (UniqueName: \"kubernetes.io/projected/2bb3057f-10bb-43e9-af01-41131c5b6fb1-kube-api-access-w9wwm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd\" (UID: \"2bb3057f-10bb-43e9-af01-41131c5b6fb1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.663007 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bb3057f-10bb-43e9-af01-41131c5b6fb1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd\" (UID: \"2bb3057f-10bb-43e9-af01-41131c5b6fb1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.663083 4687 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2bb3057f-10bb-43e9-af01-41131c5b6fb1-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd\" (UID: \"2bb3057f-10bb-43e9-af01-41131c5b6fb1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.689178 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f368345f-9e9f-448e-af56-24950cc3b1f9" path="/var/lib/kubelet/pods/f368345f-9e9f-448e-af56-24950cc3b1f9/volumes" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.765320 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9wwm\" (UniqueName: \"kubernetes.io/projected/2bb3057f-10bb-43e9-af01-41131c5b6fb1-kube-api-access-w9wwm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd\" (UID: \"2bb3057f-10bb-43e9-af01-41131c5b6fb1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.765419 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bb3057f-10bb-43e9-af01-41131c5b6fb1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd\" (UID: \"2bb3057f-10bb-43e9-af01-41131c5b6fb1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.765505 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2bb3057f-10bb-43e9-af01-41131c5b6fb1-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd\" (UID: \"2bb3057f-10bb-43e9-af01-41131c5b6fb1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd" Feb 28 
09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.774607 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2bb3057f-10bb-43e9-af01-41131c5b6fb1-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd\" (UID: \"2bb3057f-10bb-43e9-af01-41131c5b6fb1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.780521 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bb3057f-10bb-43e9-af01-41131c5b6fb1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd\" (UID: \"2bb3057f-10bb-43e9-af01-41131c5b6fb1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.798739 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9wwm\" (UniqueName: \"kubernetes.io/projected/2bb3057f-10bb-43e9-af01-41131c5b6fb1-kube-api-access-w9wwm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd\" (UID: \"2bb3057f-10bb-43e9-af01-41131c5b6fb1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.828849 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd" Feb 28 09:32:00 crc kubenswrapper[4687]: I0228 09:32:00.857522 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537852-2zdp6"] Feb 28 09:32:01 crc kubenswrapper[4687]: I0228 09:32:01.040890 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-trkpz"] Feb 28 09:32:01 crc kubenswrapper[4687]: I0228 09:32:01.046093 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-trkpz"] Feb 28 09:32:01 crc kubenswrapper[4687]: I0228 09:32:01.283993 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd"] Feb 28 09:32:01 crc kubenswrapper[4687]: W0228 09:32:01.285635 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bb3057f_10bb_43e9_af01_41131c5b6fb1.slice/crio-e97f041ff4904503be2fd108c4d5edfa329e06492b7e949bcb2e92e8020ec766 WatchSource:0}: Error finding container e97f041ff4904503be2fd108c4d5edfa329e06492b7e949bcb2e92e8020ec766: Status 404 returned error can't find the container with id e97f041ff4904503be2fd108c4d5edfa329e06492b7e949bcb2e92e8020ec766 Feb 28 09:32:01 crc kubenswrapper[4687]: I0228 09:32:01.453723 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd" event={"ID":"2bb3057f-10bb-43e9-af01-41131c5b6fb1","Type":"ContainerStarted","Data":"e97f041ff4904503be2fd108c4d5edfa329e06492b7e949bcb2e92e8020ec766"} Feb 28 09:32:01 crc kubenswrapper[4687]: I0228 09:32:01.455222 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537852-2zdp6" event={"ID":"958ee491-1300-475e-9410-521ebb3f5078","Type":"ContainerStarted","Data":"b457d8a0661d49fceafe6c8325db7eb44a3607636f154c6e48b5d192148f0dc3"} Feb 28 
09:32:01 crc kubenswrapper[4687]: I0228 09:32:01.657844 4687 scope.go:117] "RemoveContainer" containerID="3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" Feb 28 09:32:01 crc kubenswrapper[4687]: E0228 09:32:01.658553 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:32:02 crc kubenswrapper[4687]: I0228 09:32:02.466050 4687 generic.go:334] "Generic (PLEG): container finished" podID="958ee491-1300-475e-9410-521ebb3f5078" containerID="0cf4c155a01ed143e132c679c863a135ae9ab88907072c7272392f9baebda177" exitCode=0 Feb 28 09:32:02 crc kubenswrapper[4687]: I0228 09:32:02.466142 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537852-2zdp6" event={"ID":"958ee491-1300-475e-9410-521ebb3f5078","Type":"ContainerDied","Data":"0cf4c155a01ed143e132c679c863a135ae9ab88907072c7272392f9baebda177"} Feb 28 09:32:02 crc kubenswrapper[4687]: I0228 09:32:02.468246 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd" event={"ID":"2bb3057f-10bb-43e9-af01-41131c5b6fb1","Type":"ContainerStarted","Data":"12de39b64027b4673c018330d820e715aec797380dfd42b4ad33cf3ed2744c7e"} Feb 28 09:32:02 crc kubenswrapper[4687]: I0228 09:32:02.516257 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd" podStartSLOduration=1.961174909 podStartE2EDuration="2.51623775s" podCreationTimestamp="2026-02-28 09:32:00 +0000 UTC" firstStartedPulling="2026-02-28 09:32:01.287697044 +0000 UTC m=+1712.978266380" 
lastFinishedPulling="2026-02-28 09:32:01.842759884 +0000 UTC m=+1713.533329221" observedRunningTime="2026-02-28 09:32:02.507148461 +0000 UTC m=+1714.197717799" watchObservedRunningTime="2026-02-28 09:32:02.51623775 +0000 UTC m=+1714.206807087" Feb 28 09:32:02 crc kubenswrapper[4687]: I0228 09:32:02.666515 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cfad4a9-c499-491b-bc53-5346948e6e2a" path="/var/lib/kubelet/pods/3cfad4a9-c499-491b-bc53-5346948e6e2a/volumes" Feb 28 09:32:03 crc kubenswrapper[4687]: I0228 09:32:03.775295 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537852-2zdp6" Feb 28 09:32:03 crc kubenswrapper[4687]: I0228 09:32:03.930121 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7xz7\" (UniqueName: \"kubernetes.io/projected/958ee491-1300-475e-9410-521ebb3f5078-kube-api-access-d7xz7\") pod \"958ee491-1300-475e-9410-521ebb3f5078\" (UID: \"958ee491-1300-475e-9410-521ebb3f5078\") " Feb 28 09:32:03 crc kubenswrapper[4687]: I0228 09:32:03.937287 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/958ee491-1300-475e-9410-521ebb3f5078-kube-api-access-d7xz7" (OuterVolumeSpecName: "kube-api-access-d7xz7") pod "958ee491-1300-475e-9410-521ebb3f5078" (UID: "958ee491-1300-475e-9410-521ebb3f5078"). InnerVolumeSpecName "kube-api-access-d7xz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:32:04 crc kubenswrapper[4687]: I0228 09:32:04.033927 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7xz7\" (UniqueName: \"kubernetes.io/projected/958ee491-1300-475e-9410-521ebb3f5078-kube-api-access-d7xz7\") on node \"crc\" DevicePath \"\"" Feb 28 09:32:04 crc kubenswrapper[4687]: I0228 09:32:04.487379 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537852-2zdp6" event={"ID":"958ee491-1300-475e-9410-521ebb3f5078","Type":"ContainerDied","Data":"b457d8a0661d49fceafe6c8325db7eb44a3607636f154c6e48b5d192148f0dc3"} Feb 28 09:32:04 crc kubenswrapper[4687]: I0228 09:32:04.487426 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b457d8a0661d49fceafe6c8325db7eb44a3607636f154c6e48b5d192148f0dc3" Feb 28 09:32:04 crc kubenswrapper[4687]: I0228 09:32:04.487512 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537852-2zdp6" Feb 28 09:32:04 crc kubenswrapper[4687]: I0228 09:32:04.831964 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537846-hkvwm"] Feb 28 09:32:04 crc kubenswrapper[4687]: I0228 09:32:04.839575 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537846-hkvwm"] Feb 28 09:32:06 crc kubenswrapper[4687]: I0228 09:32:06.667005 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d025ea4-23bc-45b7-b5c3-4f35b3d9d431" path="/var/lib/kubelet/pods/3d025ea4-23bc-45b7-b5c3-4f35b3d9d431/volumes" Feb 28 09:32:09 crc kubenswrapper[4687]: I0228 09:32:09.534124 4687 generic.go:334] "Generic (PLEG): container finished" podID="2bb3057f-10bb-43e9-af01-41131c5b6fb1" containerID="12de39b64027b4673c018330d820e715aec797380dfd42b4ad33cf3ed2744c7e" exitCode=0 Feb 28 09:32:09 crc kubenswrapper[4687]: I0228 09:32:09.534225 4687 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd" event={"ID":"2bb3057f-10bb-43e9-af01-41131c5b6fb1","Type":"ContainerDied","Data":"12de39b64027b4673c018330d820e715aec797380dfd42b4ad33cf3ed2744c7e"} Feb 28 09:32:10 crc kubenswrapper[4687]: I0228 09:32:10.887587 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd" Feb 28 09:32:10 crc kubenswrapper[4687]: I0228 09:32:10.982660 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2bb3057f-10bb-43e9-af01-41131c5b6fb1-ssh-key-openstack-edpm-ipam\") pod \"2bb3057f-10bb-43e9-af01-41131c5b6fb1\" (UID: \"2bb3057f-10bb-43e9-af01-41131c5b6fb1\") " Feb 28 09:32:10 crc kubenswrapper[4687]: I0228 09:32:10.982734 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bb3057f-10bb-43e9-af01-41131c5b6fb1-inventory\") pod \"2bb3057f-10bb-43e9-af01-41131c5b6fb1\" (UID: \"2bb3057f-10bb-43e9-af01-41131c5b6fb1\") " Feb 28 09:32:10 crc kubenswrapper[4687]: I0228 09:32:10.982782 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9wwm\" (UniqueName: \"kubernetes.io/projected/2bb3057f-10bb-43e9-af01-41131c5b6fb1-kube-api-access-w9wwm\") pod \"2bb3057f-10bb-43e9-af01-41131c5b6fb1\" (UID: \"2bb3057f-10bb-43e9-af01-41131c5b6fb1\") " Feb 28 09:32:10 crc kubenswrapper[4687]: I0228 09:32:10.990940 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bb3057f-10bb-43e9-af01-41131c5b6fb1-kube-api-access-w9wwm" (OuterVolumeSpecName: "kube-api-access-w9wwm") pod "2bb3057f-10bb-43e9-af01-41131c5b6fb1" (UID: "2bb3057f-10bb-43e9-af01-41131c5b6fb1"). InnerVolumeSpecName "kube-api-access-w9wwm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.005173 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb3057f-10bb-43e9-af01-41131c5b6fb1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2bb3057f-10bb-43e9-af01-41131c5b6fb1" (UID: "2bb3057f-10bb-43e9-af01-41131c5b6fb1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.005846 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb3057f-10bb-43e9-af01-41131c5b6fb1-inventory" (OuterVolumeSpecName: "inventory") pod "2bb3057f-10bb-43e9-af01-41131c5b6fb1" (UID: "2bb3057f-10bb-43e9-af01-41131c5b6fb1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.086559 4687 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2bb3057f-10bb-43e9-af01-41131c5b6fb1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.086608 4687 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bb3057f-10bb-43e9-af01-41131c5b6fb1-inventory\") on node \"crc\" DevicePath \"\"" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.086620 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9wwm\" (UniqueName: \"kubernetes.io/projected/2bb3057f-10bb-43e9-af01-41131c5b6fb1-kube-api-access-w9wwm\") on node \"crc\" DevicePath \"\"" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.554141 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd" 
event={"ID":"2bb3057f-10bb-43e9-af01-41131c5b6fb1","Type":"ContainerDied","Data":"e97f041ff4904503be2fd108c4d5edfa329e06492b7e949bcb2e92e8020ec766"} Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.554496 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e97f041ff4904503be2fd108c4d5edfa329e06492b7e949bcb2e92e8020ec766" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.554174 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.633936 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k"] Feb 28 09:32:11 crc kubenswrapper[4687]: E0228 09:32:11.644888 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="958ee491-1300-475e-9410-521ebb3f5078" containerName="oc" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.644926 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="958ee491-1300-475e-9410-521ebb3f5078" containerName="oc" Feb 28 09:32:11 crc kubenswrapper[4687]: E0228 09:32:11.644966 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb3057f-10bb-43e9-af01-41131c5b6fb1" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.644974 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb3057f-10bb-43e9-af01-41131c5b6fb1" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.645514 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="958ee491-1300-475e-9410-521ebb3f5078" containerName="oc" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.645568 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb3057f-10bb-43e9-af01-41131c5b6fb1" 
containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.646716 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.655975 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.656137 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ffgb4" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.656214 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.656301 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.656460 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.656610 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.658661 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.658967 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.660275 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k"] Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 
09:32:11.700892 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.701037 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.701137 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h2b8\" (UniqueName: \"kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-kube-api-access-4h2b8\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.701334 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.701416 4687 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.701627 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.701809 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.701873 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.701938 4687 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.701981 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.702074 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.702134 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.702166 4687 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.702266 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.804915 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.805454 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.805594 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.805779 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.805878 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.805976 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.806083 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.806171 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.807066 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.807345 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.807511 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" 
(UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.807705 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h2b8\" (UniqueName: \"kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-kube-api-access-4h2b8\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.807899 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.808250 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.810300 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.810396 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.810462 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.810947 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.811783 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.812244 4687 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.812339 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.812949 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.813151 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.813324 4687 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.814064 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.814072 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.815068 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.825695 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h2b8\" (UniqueName: \"kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-kube-api-access-4h2b8\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:11 crc kubenswrapper[4687]: I0228 09:32:11.969156 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:12 crc kubenswrapper[4687]: I0228 09:32:12.437226 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k"] Feb 28 09:32:12 crc kubenswrapper[4687]: I0228 09:32:12.566238 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" event={"ID":"d966dc9f-36d1-4236-8839-0f9794c0e663","Type":"ContainerStarted","Data":"356b187ecf272e30ead6d14061608acdce5634329853e4c209242f079adcd741"} Feb 28 09:32:13 crc kubenswrapper[4687]: I0228 09:32:13.576972 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" event={"ID":"d966dc9f-36d1-4236-8839-0f9794c0e663","Type":"ContainerStarted","Data":"9c1f86eb2337c0d76e26b20eccac3d455aebd31eb0ed0f42bdc8d312989531f1"} Feb 28 09:32:13 crc kubenswrapper[4687]: I0228 09:32:13.602395 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" podStartSLOduration=2.007659997 podStartE2EDuration="2.602382549s" podCreationTimestamp="2026-02-28 09:32:11 +0000 UTC" firstStartedPulling="2026-02-28 09:32:12.443565989 +0000 UTC m=+1724.134135325" lastFinishedPulling="2026-02-28 09:32:13.038288541 +0000 UTC m=+1724.728857877" observedRunningTime="2026-02-28 09:32:13.599233849 +0000 UTC m=+1725.289803186" watchObservedRunningTime="2026-02-28 09:32:13.602382549 +0000 UTC m=+1725.292951886" Feb 28 09:32:14 crc kubenswrapper[4687]: I0228 09:32:14.656640 4687 
scope.go:117] "RemoveContainer" containerID="3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" Feb 28 09:32:14 crc kubenswrapper[4687]: E0228 09:32:14.657221 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:32:18 crc kubenswrapper[4687]: I0228 09:32:18.026288 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-vw96t"] Feb 28 09:32:18 crc kubenswrapper[4687]: I0228 09:32:18.033090 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-vw96t"] Feb 28 09:32:18 crc kubenswrapper[4687]: I0228 09:32:18.665908 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7" path="/var/lib/kubelet/pods/1dc1b23e-45ba-4bf1-9c7f-66b6218fbae7/volumes" Feb 28 09:32:27 crc kubenswrapper[4687]: I0228 09:32:27.657126 4687 scope.go:117] "RemoveContainer" containerID="3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" Feb 28 09:32:27 crc kubenswrapper[4687]: E0228 09:32:27.657897 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:32:39 crc kubenswrapper[4687]: I0228 09:32:39.758632 4687 generic.go:334] "Generic (PLEG): container finished" 
podID="d966dc9f-36d1-4236-8839-0f9794c0e663" containerID="9c1f86eb2337c0d76e26b20eccac3d455aebd31eb0ed0f42bdc8d312989531f1" exitCode=0 Feb 28 09:32:39 crc kubenswrapper[4687]: I0228 09:32:39.758720 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" event={"ID":"d966dc9f-36d1-4236-8839-0f9794c0e663","Type":"ContainerDied","Data":"9c1f86eb2337c0d76e26b20eccac3d455aebd31eb0ed0f42bdc8d312989531f1"} Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.076819 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.127724 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-inventory\") pod \"d966dc9f-36d1-4236-8839-0f9794c0e663\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.127761 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-libvirt-combined-ca-bundle\") pod \"d966dc9f-36d1-4236-8839-0f9794c0e663\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.127801 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-bootstrap-combined-ca-bundle\") pod \"d966dc9f-36d1-4236-8839-0f9794c0e663\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.127852 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"d966dc9f-36d1-4236-8839-0f9794c0e663\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.127889 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-neutron-metadata-combined-ca-bundle\") pod \"d966dc9f-36d1-4236-8839-0f9794c0e663\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.127938 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-nova-combined-ca-bundle\") pod \"d966dc9f-36d1-4236-8839-0f9794c0e663\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.127971 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-telemetry-combined-ca-bundle\") pod \"d966dc9f-36d1-4236-8839-0f9794c0e663\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.127988 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"d966dc9f-36d1-4236-8839-0f9794c0e663\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.128036 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-ovn-combined-ca-bundle\") pod \"d966dc9f-36d1-4236-8839-0f9794c0e663\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.128062 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-openstack-edpm-ipam-ovn-default-certs-0\") pod \"d966dc9f-36d1-4236-8839-0f9794c0e663\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.128095 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h2b8\" (UniqueName: \"kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-kube-api-access-4h2b8\") pod \"d966dc9f-36d1-4236-8839-0f9794c0e663\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.128114 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-ssh-key-openstack-edpm-ipam\") pod \"d966dc9f-36d1-4236-8839-0f9794c0e663\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.128142 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"d966dc9f-36d1-4236-8839-0f9794c0e663\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.128200 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-repo-setup-combined-ca-bundle\") pod \"d966dc9f-36d1-4236-8839-0f9794c0e663\" (UID: \"d966dc9f-36d1-4236-8839-0f9794c0e663\") " Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.132686 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "d966dc9f-36d1-4236-8839-0f9794c0e663" (UID: "d966dc9f-36d1-4236-8839-0f9794c0e663"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.132950 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "d966dc9f-36d1-4236-8839-0f9794c0e663" (UID: "d966dc9f-36d1-4236-8839-0f9794c0e663"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.134242 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "d966dc9f-36d1-4236-8839-0f9794c0e663" (UID: "d966dc9f-36d1-4236-8839-0f9794c0e663"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.134270 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "d966dc9f-36d1-4236-8839-0f9794c0e663" (UID: "d966dc9f-36d1-4236-8839-0f9794c0e663"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.134442 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "d966dc9f-36d1-4236-8839-0f9794c0e663" (UID: "d966dc9f-36d1-4236-8839-0f9794c0e663"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.134462 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "d966dc9f-36d1-4236-8839-0f9794c0e663" (UID: "d966dc9f-36d1-4236-8839-0f9794c0e663"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.134735 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "d966dc9f-36d1-4236-8839-0f9794c0e663" (UID: "d966dc9f-36d1-4236-8839-0f9794c0e663"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.134995 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "d966dc9f-36d1-4236-8839-0f9794c0e663" (UID: "d966dc9f-36d1-4236-8839-0f9794c0e663"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.135159 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-kube-api-access-4h2b8" (OuterVolumeSpecName: "kube-api-access-4h2b8") pod "d966dc9f-36d1-4236-8839-0f9794c0e663" (UID: "d966dc9f-36d1-4236-8839-0f9794c0e663"). InnerVolumeSpecName "kube-api-access-4h2b8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.135168 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "d966dc9f-36d1-4236-8839-0f9794c0e663" (UID: "d966dc9f-36d1-4236-8839-0f9794c0e663"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.137377 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "d966dc9f-36d1-4236-8839-0f9794c0e663" (UID: "d966dc9f-36d1-4236-8839-0f9794c0e663"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.149502 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "d966dc9f-36d1-4236-8839-0f9794c0e663" (UID: "d966dc9f-36d1-4236-8839-0f9794c0e663"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.152749 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d966dc9f-36d1-4236-8839-0f9794c0e663" (UID: "d966dc9f-36d1-4236-8839-0f9794c0e663"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.154327 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-inventory" (OuterVolumeSpecName: "inventory") pod "d966dc9f-36d1-4236-8839-0f9794c0e663" (UID: "d966dc9f-36d1-4236-8839-0f9794c0e663"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.230632 4687 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.230659 4687 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-inventory\") on node \"crc\" DevicePath \"\"" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.230670 4687 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.230680 4687 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.230690 4687 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.230700 4687 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.230710 4687 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.230718 4687 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.230726 4687 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.230736 4687 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.230744 4687 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.230752 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h2b8\" (UniqueName: \"kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-kube-api-access-4h2b8\") on node \"crc\" DevicePath \"\"" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.230761 4687 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d966dc9f-36d1-4236-8839-0f9794c0e663-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.230769 4687 
reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d966dc9f-36d1-4236-8839-0f9794c0e663-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.657744 4687 scope.go:117] "RemoveContainer" containerID="3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" Feb 28 09:32:41 crc kubenswrapper[4687]: E0228 09:32:41.658120 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.772407 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" event={"ID":"d966dc9f-36d1-4236-8839-0f9794c0e663","Type":"ContainerDied","Data":"356b187ecf272e30ead6d14061608acdce5634329853e4c209242f079adcd741"} Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.772443 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="356b187ecf272e30ead6d14061608acdce5634329853e4c209242f079adcd741" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.772455 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.840428 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4"] Feb 28 09:32:41 crc kubenswrapper[4687]: E0228 09:32:41.840794 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d966dc9f-36d1-4236-8839-0f9794c0e663" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.840814 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="d966dc9f-36d1-4236-8839-0f9794c0e663" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.841045 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="d966dc9f-36d1-4236-8839-0f9794c0e663" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.841705 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.843586 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.846165 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.846209 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.846636 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.846724 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ffgb4" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.850617 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4"] Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.943923 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1151261-c776-4190-ad84-46a4a3c68a6a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hcsz4\" (UID: \"c1151261-c776-4190-ad84-46a4a3c68a6a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.944045 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1151261-c776-4190-ad84-46a4a3c68a6a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hcsz4\" (UID: \"c1151261-c776-4190-ad84-46a4a3c68a6a\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.944195 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1151261-c776-4190-ad84-46a4a3c68a6a-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hcsz4\" (UID: \"c1151261-c776-4190-ad84-46a4a3c68a6a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.944336 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkcdc\" (UniqueName: \"kubernetes.io/projected/c1151261-c776-4190-ad84-46a4a3c68a6a-kube-api-access-lkcdc\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hcsz4\" (UID: \"c1151261-c776-4190-ad84-46a4a3c68a6a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4" Feb 28 09:32:41 crc kubenswrapper[4687]: I0228 09:32:41.944378 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c1151261-c776-4190-ad84-46a4a3c68a6a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hcsz4\" (UID: \"c1151261-c776-4190-ad84-46a4a3c68a6a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4" Feb 28 09:32:42 crc kubenswrapper[4687]: I0228 09:32:42.045996 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkcdc\" (UniqueName: \"kubernetes.io/projected/c1151261-c776-4190-ad84-46a4a3c68a6a-kube-api-access-lkcdc\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hcsz4\" (UID: \"c1151261-c776-4190-ad84-46a4a3c68a6a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4" Feb 28 09:32:42 crc kubenswrapper[4687]: I0228 09:32:42.046261 4687 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c1151261-c776-4190-ad84-46a4a3c68a6a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hcsz4\" (UID: \"c1151261-c776-4190-ad84-46a4a3c68a6a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4" Feb 28 09:32:42 crc kubenswrapper[4687]: I0228 09:32:42.046341 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1151261-c776-4190-ad84-46a4a3c68a6a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hcsz4\" (UID: \"c1151261-c776-4190-ad84-46a4a3c68a6a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4" Feb 28 09:32:42 crc kubenswrapper[4687]: I0228 09:32:42.046433 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1151261-c776-4190-ad84-46a4a3c68a6a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hcsz4\" (UID: \"c1151261-c776-4190-ad84-46a4a3c68a6a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4" Feb 28 09:32:42 crc kubenswrapper[4687]: I0228 09:32:42.046555 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1151261-c776-4190-ad84-46a4a3c68a6a-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hcsz4\" (UID: \"c1151261-c776-4190-ad84-46a4a3c68a6a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4" Feb 28 09:32:42 crc kubenswrapper[4687]: I0228 09:32:42.047375 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c1151261-c776-4190-ad84-46a4a3c68a6a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hcsz4\" (UID: 
\"c1151261-c776-4190-ad84-46a4a3c68a6a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4" Feb 28 09:32:42 crc kubenswrapper[4687]: I0228 09:32:42.050246 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1151261-c776-4190-ad84-46a4a3c68a6a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hcsz4\" (UID: \"c1151261-c776-4190-ad84-46a4a3c68a6a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4" Feb 28 09:32:42 crc kubenswrapper[4687]: I0228 09:32:42.050382 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1151261-c776-4190-ad84-46a4a3c68a6a-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hcsz4\" (UID: \"c1151261-c776-4190-ad84-46a4a3c68a6a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4" Feb 28 09:32:42 crc kubenswrapper[4687]: I0228 09:32:42.050455 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1151261-c776-4190-ad84-46a4a3c68a6a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hcsz4\" (UID: \"c1151261-c776-4190-ad84-46a4a3c68a6a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4" Feb 28 09:32:42 crc kubenswrapper[4687]: I0228 09:32:42.060194 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkcdc\" (UniqueName: \"kubernetes.io/projected/c1151261-c776-4190-ad84-46a4a3c68a6a-kube-api-access-lkcdc\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hcsz4\" (UID: \"c1151261-c776-4190-ad84-46a4a3c68a6a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4" Feb 28 09:32:42 crc kubenswrapper[4687]: I0228 09:32:42.158228 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4" Feb 28 09:32:42 crc kubenswrapper[4687]: I0228 09:32:42.592125 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4"] Feb 28 09:32:42 crc kubenswrapper[4687]: I0228 09:32:42.595625 4687 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 28 09:32:42 crc kubenswrapper[4687]: I0228 09:32:42.780339 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4" event={"ID":"c1151261-c776-4190-ad84-46a4a3c68a6a","Type":"ContainerStarted","Data":"2bd9f6473db5bb3aed6624368fbc2989462ac2fa410831f046ddf0f969dbc144"} Feb 28 09:32:42 crc kubenswrapper[4687]: I0228 09:32:42.901578 4687 scope.go:117] "RemoveContainer" containerID="799ff8485be9db4864a4e0a9ee0295a8bb1f7cfff3160ec45b8e4b43cb6ec244" Feb 28 09:32:42 crc kubenswrapper[4687]: I0228 09:32:42.933278 4687 scope.go:117] "RemoveContainer" containerID="fe67e581c0171a0d36c34a7aaf9063b4b6785236125470ebd698a17fb0341b72" Feb 28 09:32:42 crc kubenswrapper[4687]: I0228 09:32:42.957733 4687 scope.go:117] "RemoveContainer" containerID="71a5cc66932b3a81fc5c97753c47d9b6ae7801ae91551b9f6c1b5f925bc09223" Feb 28 09:32:42 crc kubenswrapper[4687]: I0228 09:32:42.988925 4687 scope.go:117] "RemoveContainer" containerID="df80ab0853ae48c8725369755041e05ead28d3cc7e78bd001524e8958836fba5" Feb 28 09:32:43 crc kubenswrapper[4687]: I0228 09:32:43.788731 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4" event={"ID":"c1151261-c776-4190-ad84-46a4a3c68a6a","Type":"ContainerStarted","Data":"d0f76a0130cbbfe37ddf2ae0905dd60073d728fff624c0a2e06ef446c0728362"} Feb 28 09:32:43 crc kubenswrapper[4687]: I0228 09:32:43.803622 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4" podStartSLOduration=2.302036153 podStartE2EDuration="2.803607385s" podCreationTimestamp="2026-02-28 09:32:41 +0000 UTC" firstStartedPulling="2026-02-28 09:32:42.594986249 +0000 UTC m=+1754.285555587" lastFinishedPulling="2026-02-28 09:32:43.096557482 +0000 UTC m=+1754.787126819" observedRunningTime="2026-02-28 09:32:43.80139177 +0000 UTC m=+1755.491961107" watchObservedRunningTime="2026-02-28 09:32:43.803607385 +0000 UTC m=+1755.494176723" Feb 28 09:32:56 crc kubenswrapper[4687]: I0228 09:32:56.656239 4687 scope.go:117] "RemoveContainer" containerID="3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" Feb 28 09:32:56 crc kubenswrapper[4687]: E0228 09:32:56.656800 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:33:11 crc kubenswrapper[4687]: I0228 09:33:11.656669 4687 scope.go:117] "RemoveContainer" containerID="3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" Feb 28 09:33:11 crc kubenswrapper[4687]: E0228 09:33:11.657359 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:33:22 crc kubenswrapper[4687]: I0228 09:33:22.657386 4687 scope.go:117] "RemoveContainer" 
containerID="3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" Feb 28 09:33:22 crc kubenswrapper[4687]: E0228 09:33:22.658071 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:33:27 crc kubenswrapper[4687]: I0228 09:33:27.082092 4687 generic.go:334] "Generic (PLEG): container finished" podID="c1151261-c776-4190-ad84-46a4a3c68a6a" containerID="d0f76a0130cbbfe37ddf2ae0905dd60073d728fff624c0a2e06ef446c0728362" exitCode=0 Feb 28 09:33:27 crc kubenswrapper[4687]: I0228 09:33:27.082186 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4" event={"ID":"c1151261-c776-4190-ad84-46a4a3c68a6a","Type":"ContainerDied","Data":"d0f76a0130cbbfe37ddf2ae0905dd60073d728fff624c0a2e06ef446c0728362"} Feb 28 09:33:28 crc kubenswrapper[4687]: I0228 09:33:28.433745 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4" Feb 28 09:33:28 crc kubenswrapper[4687]: I0228 09:33:28.584389 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkcdc\" (UniqueName: \"kubernetes.io/projected/c1151261-c776-4190-ad84-46a4a3c68a6a-kube-api-access-lkcdc\") pod \"c1151261-c776-4190-ad84-46a4a3c68a6a\" (UID: \"c1151261-c776-4190-ad84-46a4a3c68a6a\") " Feb 28 09:33:28 crc kubenswrapper[4687]: I0228 09:33:28.584934 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1151261-c776-4190-ad84-46a4a3c68a6a-inventory\") pod \"c1151261-c776-4190-ad84-46a4a3c68a6a\" (UID: \"c1151261-c776-4190-ad84-46a4a3c68a6a\") " Feb 28 09:33:28 crc kubenswrapper[4687]: I0228 09:33:28.585038 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c1151261-c776-4190-ad84-46a4a3c68a6a-ovncontroller-config-0\") pod \"c1151261-c776-4190-ad84-46a4a3c68a6a\" (UID: \"c1151261-c776-4190-ad84-46a4a3c68a6a\") " Feb 28 09:33:28 crc kubenswrapper[4687]: I0228 09:33:28.585078 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1151261-c776-4190-ad84-46a4a3c68a6a-ssh-key-openstack-edpm-ipam\") pod \"c1151261-c776-4190-ad84-46a4a3c68a6a\" (UID: \"c1151261-c776-4190-ad84-46a4a3c68a6a\") " Feb 28 09:33:28 crc kubenswrapper[4687]: I0228 09:33:28.585161 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1151261-c776-4190-ad84-46a4a3c68a6a-ovn-combined-ca-bundle\") pod \"c1151261-c776-4190-ad84-46a4a3c68a6a\" (UID: \"c1151261-c776-4190-ad84-46a4a3c68a6a\") " Feb 28 09:33:28 crc kubenswrapper[4687]: I0228 09:33:28.591303 4687 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1151261-c776-4190-ad84-46a4a3c68a6a-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "c1151261-c776-4190-ad84-46a4a3c68a6a" (UID: "c1151261-c776-4190-ad84-46a4a3c68a6a"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:33:28 crc kubenswrapper[4687]: I0228 09:33:28.591497 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1151261-c776-4190-ad84-46a4a3c68a6a-kube-api-access-lkcdc" (OuterVolumeSpecName: "kube-api-access-lkcdc") pod "c1151261-c776-4190-ad84-46a4a3c68a6a" (UID: "c1151261-c776-4190-ad84-46a4a3c68a6a"). InnerVolumeSpecName "kube-api-access-lkcdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:33:28 crc kubenswrapper[4687]: I0228 09:33:28.607935 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1151261-c776-4190-ad84-46a4a3c68a6a-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "c1151261-c776-4190-ad84-46a4a3c68a6a" (UID: "c1151261-c776-4190-ad84-46a4a3c68a6a"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:33:28 crc kubenswrapper[4687]: I0228 09:33:28.609989 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1151261-c776-4190-ad84-46a4a3c68a6a-inventory" (OuterVolumeSpecName: "inventory") pod "c1151261-c776-4190-ad84-46a4a3c68a6a" (UID: "c1151261-c776-4190-ad84-46a4a3c68a6a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:33:28 crc kubenswrapper[4687]: I0228 09:33:28.610697 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1151261-c776-4190-ad84-46a4a3c68a6a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c1151261-c776-4190-ad84-46a4a3c68a6a" (UID: "c1151261-c776-4190-ad84-46a4a3c68a6a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:33:28 crc kubenswrapper[4687]: I0228 09:33:28.688862 4687 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1151261-c776-4190-ad84-46a4a3c68a6a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 28 09:33:28 crc kubenswrapper[4687]: I0228 09:33:28.688923 4687 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1151261-c776-4190-ad84-46a4a3c68a6a-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:33:28 crc kubenswrapper[4687]: I0228 09:33:28.688936 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkcdc\" (UniqueName: \"kubernetes.io/projected/c1151261-c776-4190-ad84-46a4a3c68a6a-kube-api-access-lkcdc\") on node \"crc\" DevicePath \"\"" Feb 28 09:33:28 crc kubenswrapper[4687]: I0228 09:33:28.688950 4687 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1151261-c776-4190-ad84-46a4a3c68a6a-inventory\") on node \"crc\" DevicePath \"\"" Feb 28 09:33:28 crc kubenswrapper[4687]: I0228 09:33:28.688962 4687 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c1151261-c776-4190-ad84-46a4a3c68a6a-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.101379 4687 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4" event={"ID":"c1151261-c776-4190-ad84-46a4a3c68a6a","Type":"ContainerDied","Data":"2bd9f6473db5bb3aed6624368fbc2989462ac2fa410831f046ddf0f969dbc144"} Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.101747 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bd9f6473db5bb3aed6624368fbc2989462ac2fa410831f046ddf0f969dbc144" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.101486 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hcsz4" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.193220 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8"] Feb 28 09:33:29 crc kubenswrapper[4687]: E0228 09:33:29.193697 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1151261-c776-4190-ad84-46a4a3c68a6a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.193715 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1151261-c776-4190-ad84-46a4a3c68a6a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.193900 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1151261-c776-4190-ad84-46a4a3c68a6a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.194578 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.196982 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.198641 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.198805 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.200723 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.200835 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ffgb4" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.201074 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.202060 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8"] Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.303869 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8\" (UID: \"29b1d03b-8788-4d8d-8105-700b9cfe905a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.303921 4687 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8\" (UID: \"29b1d03b-8788-4d8d-8105-700b9cfe905a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.303973 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8\" (UID: \"29b1d03b-8788-4d8d-8105-700b9cfe905a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.303996 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8\" (UID: \"29b1d03b-8788-4d8d-8105-700b9cfe905a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.304147 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8\" (UID: \"29b1d03b-8788-4d8d-8105-700b9cfe905a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.304246 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-r8l6d\" (UniqueName: \"kubernetes.io/projected/29b1d03b-8788-4d8d-8105-700b9cfe905a-kube-api-access-r8l6d\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8\" (UID: \"29b1d03b-8788-4d8d-8105-700b9cfe905a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.405549 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8\" (UID: \"29b1d03b-8788-4d8d-8105-700b9cfe905a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.405596 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8\" (UID: \"29b1d03b-8788-4d8d-8105-700b9cfe905a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.405628 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8\" (UID: \"29b1d03b-8788-4d8d-8105-700b9cfe905a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.405649 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8\" (UID: \"29b1d03b-8788-4d8d-8105-700b9cfe905a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.405737 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8\" (UID: \"29b1d03b-8788-4d8d-8105-700b9cfe905a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.405814 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8l6d\" (UniqueName: \"kubernetes.io/projected/29b1d03b-8788-4d8d-8105-700b9cfe905a-kube-api-access-r8l6d\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8\" (UID: \"29b1d03b-8788-4d8d-8105-700b9cfe905a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.408873 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8\" (UID: \"29b1d03b-8788-4d8d-8105-700b9cfe905a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.408944 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8\" (UID: 
\"29b1d03b-8788-4d8d-8105-700b9cfe905a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.409261 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8\" (UID: \"29b1d03b-8788-4d8d-8105-700b9cfe905a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.409869 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8\" (UID: \"29b1d03b-8788-4d8d-8105-700b9cfe905a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.410573 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8\" (UID: \"29b1d03b-8788-4d8d-8105-700b9cfe905a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.423304 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8l6d\" (UniqueName: \"kubernetes.io/projected/29b1d03b-8788-4d8d-8105-700b9cfe905a-kube-api-access-r8l6d\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8\" (UID: \"29b1d03b-8788-4d8d-8105-700b9cfe905a\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.520938 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" Feb 28 09:33:29 crc kubenswrapper[4687]: W0228 09:33:29.873846 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29b1d03b_8788_4d8d_8105_700b9cfe905a.slice/crio-1a5e575e575f3ea670d30739a9d81dd2cc02f08328836480caca924b8ee3da9c WatchSource:0}: Error finding container 1a5e575e575f3ea670d30739a9d81dd2cc02f08328836480caca924b8ee3da9c: Status 404 returned error can't find the container with id 1a5e575e575f3ea670d30739a9d81dd2cc02f08328836480caca924b8ee3da9c Feb 28 09:33:29 crc kubenswrapper[4687]: I0228 09:33:29.874621 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8"] Feb 28 09:33:30 crc kubenswrapper[4687]: I0228 09:33:30.109771 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" event={"ID":"29b1d03b-8788-4d8d-8105-700b9cfe905a","Type":"ContainerStarted","Data":"1a5e575e575f3ea670d30739a9d81dd2cc02f08328836480caca924b8ee3da9c"} Feb 28 09:33:30 crc kubenswrapper[4687]: E0228 09:33:30.962889 4687 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1151261_c776_4190_ad84_46a4a3c68a6a.slice/crio-2bd9f6473db5bb3aed6624368fbc2989462ac2fa410831f046ddf0f969dbc144\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1151261_c776_4190_ad84_46a4a3c68a6a.slice\": RecentStats: unable to find data in memory cache]" Feb 28 09:33:31 crc kubenswrapper[4687]: I0228 
09:33:31.118153 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" event={"ID":"29b1d03b-8788-4d8d-8105-700b9cfe905a","Type":"ContainerStarted","Data":"34f337dc284af026a0ed6bcb640805ad1e80373580505f0bc39e40dc66db532f"} Feb 28 09:33:31 crc kubenswrapper[4687]: I0228 09:33:31.140078 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" podStartSLOduration=1.498456724 podStartE2EDuration="2.140060375s" podCreationTimestamp="2026-02-28 09:33:29 +0000 UTC" firstStartedPulling="2026-02-28 09:33:29.8761068 +0000 UTC m=+1801.566676137" lastFinishedPulling="2026-02-28 09:33:30.517710451 +0000 UTC m=+1802.208279788" observedRunningTime="2026-02-28 09:33:31.133783975 +0000 UTC m=+1802.824353312" watchObservedRunningTime="2026-02-28 09:33:31.140060375 +0000 UTC m=+1802.830629712" Feb 28 09:33:34 crc kubenswrapper[4687]: I0228 09:33:34.658069 4687 scope.go:117] "RemoveContainer" containerID="3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" Feb 28 09:33:34 crc kubenswrapper[4687]: E0228 09:33:34.658750 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:33:41 crc kubenswrapper[4687]: E0228 09:33:41.160827 4687 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1151261_c776_4190_ad84_46a4a3c68a6a.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1151261_c776_4190_ad84_46a4a3c68a6a.slice/crio-2bd9f6473db5bb3aed6624368fbc2989462ac2fa410831f046ddf0f969dbc144\": RecentStats: unable to find data in memory cache]" Feb 28 09:33:48 crc kubenswrapper[4687]: I0228 09:33:48.672433 4687 scope.go:117] "RemoveContainer" containerID="3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" Feb 28 09:33:48 crc kubenswrapper[4687]: E0228 09:33:48.673255 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:33:51 crc kubenswrapper[4687]: E0228 09:33:51.344868 4687 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1151261_c776_4190_ad84_46a4a3c68a6a.slice/crio-2bd9f6473db5bb3aed6624368fbc2989462ac2fa410831f046ddf0f969dbc144\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1151261_c776_4190_ad84_46a4a3c68a6a.slice\": RecentStats: unable to find data in memory cache]" Feb 28 09:33:59 crc kubenswrapper[4687]: I0228 09:33:59.657481 4687 scope.go:117] "RemoveContainer" containerID="3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" Feb 28 09:34:00 crc kubenswrapper[4687]: I0228 09:34:00.138650 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537854-xj6v7"] Feb 28 09:34:00 crc kubenswrapper[4687]: I0228 09:34:00.140188 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537854-xj6v7" Feb 28 09:34:00 crc kubenswrapper[4687]: I0228 09:34:00.141431 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:34:00 crc kubenswrapper[4687]: I0228 09:34:00.142150 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:34:00 crc kubenswrapper[4687]: I0228 09:34:00.142390 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:34:00 crc kubenswrapper[4687]: I0228 09:34:00.145605 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537854-xj6v7"] Feb 28 09:34:00 crc kubenswrapper[4687]: I0228 09:34:00.255992 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m7tq\" (UniqueName: \"kubernetes.io/projected/716080da-ec3b-4498-b033-a048e7ca9d11-kube-api-access-5m7tq\") pod \"auto-csr-approver-29537854-xj6v7\" (UID: \"716080da-ec3b-4498-b033-a048e7ca9d11\") " pod="openshift-infra/auto-csr-approver-29537854-xj6v7" Feb 28 09:34:00 crc kubenswrapper[4687]: I0228 09:34:00.317055 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerStarted","Data":"dae4760c42bdf35ff81f24568deadc7a5d5f1d56cf50f222534d7b17be296984"} Feb 28 09:34:00 crc kubenswrapper[4687]: I0228 09:34:00.359136 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m7tq\" (UniqueName: \"kubernetes.io/projected/716080da-ec3b-4498-b033-a048e7ca9d11-kube-api-access-5m7tq\") pod \"auto-csr-approver-29537854-xj6v7\" (UID: \"716080da-ec3b-4498-b033-a048e7ca9d11\") " pod="openshift-infra/auto-csr-approver-29537854-xj6v7" Feb 28 09:34:00 crc 
kubenswrapper[4687]: I0228 09:34:00.374584 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m7tq\" (UniqueName: \"kubernetes.io/projected/716080da-ec3b-4498-b033-a048e7ca9d11-kube-api-access-5m7tq\") pod \"auto-csr-approver-29537854-xj6v7\" (UID: \"716080da-ec3b-4498-b033-a048e7ca9d11\") " pod="openshift-infra/auto-csr-approver-29537854-xj6v7" Feb 28 09:34:00 crc kubenswrapper[4687]: I0228 09:34:00.459146 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537854-xj6v7" Feb 28 09:34:00 crc kubenswrapper[4687]: I0228 09:34:00.831768 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537854-xj6v7"] Feb 28 09:34:00 crc kubenswrapper[4687]: W0228 09:34:00.832222 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod716080da_ec3b_4498_b033_a048e7ca9d11.slice/crio-d0f50753223bb5c89263901b3c26992c4f80c9903b19e55bea8b4a3111e1ec00 WatchSource:0}: Error finding container d0f50753223bb5c89263901b3c26992c4f80c9903b19e55bea8b4a3111e1ec00: Status 404 returned error can't find the container with id d0f50753223bb5c89263901b3c26992c4f80c9903b19e55bea8b4a3111e1ec00 Feb 28 09:34:01 crc kubenswrapper[4687]: I0228 09:34:01.326440 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537854-xj6v7" event={"ID":"716080da-ec3b-4498-b033-a048e7ca9d11","Type":"ContainerStarted","Data":"d0f50753223bb5c89263901b3c26992c4f80c9903b19e55bea8b4a3111e1ec00"} Feb 28 09:34:01 crc kubenswrapper[4687]: E0228 09:34:01.554068 4687 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1151261_c776_4190_ad84_46a4a3c68a6a.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1151261_c776_4190_ad84_46a4a3c68a6a.slice/crio-2bd9f6473db5bb3aed6624368fbc2989462ac2fa410831f046ddf0f969dbc144\": RecentStats: unable to find data in memory cache]" Feb 28 09:34:02 crc kubenswrapper[4687]: I0228 09:34:02.337703 4687 generic.go:334] "Generic (PLEG): container finished" podID="716080da-ec3b-4498-b033-a048e7ca9d11" containerID="fdb258840516d4030e15d02445e35edb31906d47bf058002d6dfe8d312c52131" exitCode=0 Feb 28 09:34:02 crc kubenswrapper[4687]: I0228 09:34:02.337799 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537854-xj6v7" event={"ID":"716080da-ec3b-4498-b033-a048e7ca9d11","Type":"ContainerDied","Data":"fdb258840516d4030e15d02445e35edb31906d47bf058002d6dfe8d312c52131"} Feb 28 09:34:03 crc kubenswrapper[4687]: I0228 09:34:03.620068 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537854-xj6v7" Feb 28 09:34:03 crc kubenswrapper[4687]: I0228 09:34:03.821246 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m7tq\" (UniqueName: \"kubernetes.io/projected/716080da-ec3b-4498-b033-a048e7ca9d11-kube-api-access-5m7tq\") pod \"716080da-ec3b-4498-b033-a048e7ca9d11\" (UID: \"716080da-ec3b-4498-b033-a048e7ca9d11\") " Feb 28 09:34:03 crc kubenswrapper[4687]: I0228 09:34:03.826588 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/716080da-ec3b-4498-b033-a048e7ca9d11-kube-api-access-5m7tq" (OuterVolumeSpecName: "kube-api-access-5m7tq") pod "716080da-ec3b-4498-b033-a048e7ca9d11" (UID: "716080da-ec3b-4498-b033-a048e7ca9d11"). InnerVolumeSpecName "kube-api-access-5m7tq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:34:03 crc kubenswrapper[4687]: I0228 09:34:03.923355 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m7tq\" (UniqueName: \"kubernetes.io/projected/716080da-ec3b-4498-b033-a048e7ca9d11-kube-api-access-5m7tq\") on node \"crc\" DevicePath \"\"" Feb 28 09:34:04 crc kubenswrapper[4687]: I0228 09:34:04.351786 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537854-xj6v7" event={"ID":"716080da-ec3b-4498-b033-a048e7ca9d11","Type":"ContainerDied","Data":"d0f50753223bb5c89263901b3c26992c4f80c9903b19e55bea8b4a3111e1ec00"} Feb 28 09:34:04 crc kubenswrapper[4687]: I0228 09:34:04.352241 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0f50753223bb5c89263901b3c26992c4f80c9903b19e55bea8b4a3111e1ec00" Feb 28 09:34:04 crc kubenswrapper[4687]: I0228 09:34:04.351848 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537854-xj6v7" Feb 28 09:34:04 crc kubenswrapper[4687]: I0228 09:34:04.672353 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537848-g7jxk"] Feb 28 09:34:04 crc kubenswrapper[4687]: I0228 09:34:04.677614 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537848-g7jxk"] Feb 28 09:34:05 crc kubenswrapper[4687]: I0228 09:34:05.360603 4687 generic.go:334] "Generic (PLEG): container finished" podID="29b1d03b-8788-4d8d-8105-700b9cfe905a" containerID="34f337dc284af026a0ed6bcb640805ad1e80373580505f0bc39e40dc66db532f" exitCode=0 Feb 28 09:34:05 crc kubenswrapper[4687]: I0228 09:34:05.360655 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" 
event={"ID":"29b1d03b-8788-4d8d-8105-700b9cfe905a","Type":"ContainerDied","Data":"34f337dc284af026a0ed6bcb640805ad1e80373580505f0bc39e40dc66db532f"} Feb 28 09:34:06 crc kubenswrapper[4687]: I0228 09:34:06.665423 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d741a584-384a-4d5a-bf8e-07e2603f0af0" path="/var/lib/kubelet/pods/d741a584-384a-4d5a-bf8e-07e2603f0af0/volumes" Feb 28 09:34:06 crc kubenswrapper[4687]: I0228 09:34:06.685078 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" Feb 28 09:34:06 crc kubenswrapper[4687]: I0228 09:34:06.869778 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-neutron-ovn-metadata-agent-neutron-config-0\") pod \"29b1d03b-8788-4d8d-8105-700b9cfe905a\" (UID: \"29b1d03b-8788-4d8d-8105-700b9cfe905a\") " Feb 28 09:34:06 crc kubenswrapper[4687]: I0228 09:34:06.869832 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8l6d\" (UniqueName: \"kubernetes.io/projected/29b1d03b-8788-4d8d-8105-700b9cfe905a-kube-api-access-r8l6d\") pod \"29b1d03b-8788-4d8d-8105-700b9cfe905a\" (UID: \"29b1d03b-8788-4d8d-8105-700b9cfe905a\") " Feb 28 09:34:06 crc kubenswrapper[4687]: I0228 09:34:06.869997 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-inventory\") pod \"29b1d03b-8788-4d8d-8105-700b9cfe905a\" (UID: \"29b1d03b-8788-4d8d-8105-700b9cfe905a\") " Feb 28 09:34:06 crc kubenswrapper[4687]: I0228 09:34:06.870097 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-neutron-metadata-combined-ca-bundle\") pod \"29b1d03b-8788-4d8d-8105-700b9cfe905a\" (UID: \"29b1d03b-8788-4d8d-8105-700b9cfe905a\") " Feb 28 09:34:06 crc kubenswrapper[4687]: I0228 09:34:06.870245 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-ssh-key-openstack-edpm-ipam\") pod \"29b1d03b-8788-4d8d-8105-700b9cfe905a\" (UID: \"29b1d03b-8788-4d8d-8105-700b9cfe905a\") " Feb 28 09:34:06 crc kubenswrapper[4687]: I0228 09:34:06.870283 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-nova-metadata-neutron-config-0\") pod \"29b1d03b-8788-4d8d-8105-700b9cfe905a\" (UID: \"29b1d03b-8788-4d8d-8105-700b9cfe905a\") " Feb 28 09:34:06 crc kubenswrapper[4687]: I0228 09:34:06.874992 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "29b1d03b-8788-4d8d-8105-700b9cfe905a" (UID: "29b1d03b-8788-4d8d-8105-700b9cfe905a"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:34:06 crc kubenswrapper[4687]: I0228 09:34:06.875201 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29b1d03b-8788-4d8d-8105-700b9cfe905a-kube-api-access-r8l6d" (OuterVolumeSpecName: "kube-api-access-r8l6d") pod "29b1d03b-8788-4d8d-8105-700b9cfe905a" (UID: "29b1d03b-8788-4d8d-8105-700b9cfe905a"). InnerVolumeSpecName "kube-api-access-r8l6d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:34:06 crc kubenswrapper[4687]: I0228 09:34:06.892620 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "29b1d03b-8788-4d8d-8105-700b9cfe905a" (UID: "29b1d03b-8788-4d8d-8105-700b9cfe905a"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:34:06 crc kubenswrapper[4687]: I0228 09:34:06.892644 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-inventory" (OuterVolumeSpecName: "inventory") pod "29b1d03b-8788-4d8d-8105-700b9cfe905a" (UID: "29b1d03b-8788-4d8d-8105-700b9cfe905a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:34:06 crc kubenswrapper[4687]: I0228 09:34:06.892946 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "29b1d03b-8788-4d8d-8105-700b9cfe905a" (UID: "29b1d03b-8788-4d8d-8105-700b9cfe905a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:34:06 crc kubenswrapper[4687]: I0228 09:34:06.893171 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "29b1d03b-8788-4d8d-8105-700b9cfe905a" (UID: "29b1d03b-8788-4d8d-8105-700b9cfe905a"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:34:06 crc kubenswrapper[4687]: I0228 09:34:06.972506 4687 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:34:06 crc kubenswrapper[4687]: I0228 09:34:06.972539 4687 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 28 09:34:06 crc kubenswrapper[4687]: I0228 09:34:06.972549 4687 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 28 09:34:06 crc kubenswrapper[4687]: I0228 09:34:06.972560 4687 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 28 09:34:06 crc kubenswrapper[4687]: I0228 09:34:06.972572 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8l6d\" (UniqueName: \"kubernetes.io/projected/29b1d03b-8788-4d8d-8105-700b9cfe905a-kube-api-access-r8l6d\") on node \"crc\" DevicePath \"\"" Feb 28 09:34:06 crc kubenswrapper[4687]: I0228 09:34:06.972580 4687 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29b1d03b-8788-4d8d-8105-700b9cfe905a-inventory\") on node \"crc\" DevicePath \"\"" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.374764 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" event={"ID":"29b1d03b-8788-4d8d-8105-700b9cfe905a","Type":"ContainerDied","Data":"1a5e575e575f3ea670d30739a9d81dd2cc02f08328836480caca924b8ee3da9c"} Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.374804 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a5e575e575f3ea670d30739a9d81dd2cc02f08328836480caca924b8ee3da9c" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.374805 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.432928 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9"] Feb 28 09:34:07 crc kubenswrapper[4687]: E0228 09:34:07.433307 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="716080da-ec3b-4498-b033-a048e7ca9d11" containerName="oc" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.433324 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="716080da-ec3b-4498-b033-a048e7ca9d11" containerName="oc" Feb 28 09:34:07 crc kubenswrapper[4687]: E0228 09:34:07.433359 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29b1d03b-8788-4d8d-8105-700b9cfe905a" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.433366 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="29b1d03b-8788-4d8d-8105-700b9cfe905a" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.433503 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="29b1d03b-8788-4d8d-8105-700b9cfe905a" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.433524 4687 
memory_manager.go:354] "RemoveStaleState removing state" podUID="716080da-ec3b-4498-b033-a048e7ca9d11" containerName="oc" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.434063 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.435745 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.436115 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.436911 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.437043 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.437314 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ffgb4" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.441049 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9"] Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.479690 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fb2570c-4ba8-41f6-83a3-038b8ab54177-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9\" (UID: \"6fb2570c-4ba8-41f6-83a3-038b8ab54177\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.479791 4687 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6fb2570c-4ba8-41f6-83a3-038b8ab54177-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9\" (UID: \"6fb2570c-4ba8-41f6-83a3-038b8ab54177\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.479942 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fb2570c-4ba8-41f6-83a3-038b8ab54177-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9\" (UID: \"6fb2570c-4ba8-41f6-83a3-038b8ab54177\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.480051 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6fb2570c-4ba8-41f6-83a3-038b8ab54177-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9\" (UID: \"6fb2570c-4ba8-41f6-83a3-038b8ab54177\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.480221 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzjp5\" (UniqueName: \"kubernetes.io/projected/6fb2570c-4ba8-41f6-83a3-038b8ab54177-kube-api-access-zzjp5\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9\" (UID: \"6fb2570c-4ba8-41f6-83a3-038b8ab54177\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.582128 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fb2570c-4ba8-41f6-83a3-038b8ab54177-inventory\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9\" (UID: \"6fb2570c-4ba8-41f6-83a3-038b8ab54177\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.582233 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6fb2570c-4ba8-41f6-83a3-038b8ab54177-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9\" (UID: \"6fb2570c-4ba8-41f6-83a3-038b8ab54177\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.582404 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzjp5\" (UniqueName: \"kubernetes.io/projected/6fb2570c-4ba8-41f6-83a3-038b8ab54177-kube-api-access-zzjp5\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9\" (UID: \"6fb2570c-4ba8-41f6-83a3-038b8ab54177\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.582571 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fb2570c-4ba8-41f6-83a3-038b8ab54177-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9\" (UID: \"6fb2570c-4ba8-41f6-83a3-038b8ab54177\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.582608 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6fb2570c-4ba8-41f6-83a3-038b8ab54177-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9\" (UID: \"6fb2570c-4ba8-41f6-83a3-038b8ab54177\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 
09:34:07.594815 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fb2570c-4ba8-41f6-83a3-038b8ab54177-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9\" (UID: \"6fb2570c-4ba8-41f6-83a3-038b8ab54177\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.596519 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fb2570c-4ba8-41f6-83a3-038b8ab54177-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9\" (UID: \"6fb2570c-4ba8-41f6-83a3-038b8ab54177\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.599513 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6fb2570c-4ba8-41f6-83a3-038b8ab54177-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9\" (UID: \"6fb2570c-4ba8-41f6-83a3-038b8ab54177\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.613459 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6fb2570c-4ba8-41f6-83a3-038b8ab54177-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9\" (UID: \"6fb2570c-4ba8-41f6-83a3-038b8ab54177\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.617694 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzjp5\" (UniqueName: \"kubernetes.io/projected/6fb2570c-4ba8-41f6-83a3-038b8ab54177-kube-api-access-zzjp5\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9\" 
(UID: \"6fb2570c-4ba8-41f6-83a3-038b8ab54177\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9" Feb 28 09:34:07 crc kubenswrapper[4687]: I0228 09:34:07.745842 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9" Feb 28 09:34:08 crc kubenswrapper[4687]: I0228 09:34:08.170391 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9"] Feb 28 09:34:08 crc kubenswrapper[4687]: I0228 09:34:08.382879 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9" event={"ID":"6fb2570c-4ba8-41f6-83a3-038b8ab54177","Type":"ContainerStarted","Data":"c13b7151c33e81c4122aa4e1f6779cb68b675b630d9eb22db9b3529915667deb"} Feb 28 09:34:09 crc kubenswrapper[4687]: I0228 09:34:09.390128 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9" event={"ID":"6fb2570c-4ba8-41f6-83a3-038b8ab54177","Type":"ContainerStarted","Data":"761c9022302299ee0a45296c77d0a50f37477ba60c0397fab9d3d437bfdb0837"} Feb 28 09:34:09 crc kubenswrapper[4687]: I0228 09:34:09.427167 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9" podStartSLOduration=1.948448925 podStartE2EDuration="2.427152064s" podCreationTimestamp="2026-02-28 09:34:07 +0000 UTC" firstStartedPulling="2026-02-28 09:34:08.172945895 +0000 UTC m=+1839.863515232" lastFinishedPulling="2026-02-28 09:34:08.651649034 +0000 UTC m=+1840.342218371" observedRunningTime="2026-02-28 09:34:09.420373539 +0000 UTC m=+1841.110942877" watchObservedRunningTime="2026-02-28 09:34:09.427152064 +0000 UTC m=+1841.117721400" Feb 28 09:34:11 crc kubenswrapper[4687]: E0228 09:34:11.748424 4687 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1151261_c776_4190_ad84_46a4a3c68a6a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1151261_c776_4190_ad84_46a4a3c68a6a.slice/crio-2bd9f6473db5bb3aed6624368fbc2989462ac2fa410831f046ddf0f969dbc144\": RecentStats: unable to find data in memory cache]" Feb 28 09:34:21 crc kubenswrapper[4687]: E0228 09:34:21.936793 4687 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1151261_c776_4190_ad84_46a4a3c68a6a.slice/crio-2bd9f6473db5bb3aed6624368fbc2989462ac2fa410831f046ddf0f969dbc144\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1151261_c776_4190_ad84_46a4a3c68a6a.slice\": RecentStats: unable to find data in memory cache]" Feb 28 09:34:43 crc kubenswrapper[4687]: I0228 09:34:43.081285 4687 scope.go:117] "RemoveContainer" containerID="dc90b95f5294f7a0c35cef8d5d8a70312b210c3e425f3907d089df85c9dbee95" Feb 28 09:36:00 crc kubenswrapper[4687]: I0228 09:36:00.132933 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537856-gvn4m"] Feb 28 09:36:00 crc kubenswrapper[4687]: I0228 09:36:00.135918 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537856-gvn4m" Feb 28 09:36:00 crc kubenswrapper[4687]: I0228 09:36:00.138112 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:36:00 crc kubenswrapper[4687]: I0228 09:36:00.138213 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:36:00 crc kubenswrapper[4687]: I0228 09:36:00.138513 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:36:00 crc kubenswrapper[4687]: I0228 09:36:00.141463 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537856-gvn4m"] Feb 28 09:36:00 crc kubenswrapper[4687]: I0228 09:36:00.278454 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nklkb\" (UniqueName: \"kubernetes.io/projected/5278aa8f-5ca9-4e1c-b485-9e771d15c63d-kube-api-access-nklkb\") pod \"auto-csr-approver-29537856-gvn4m\" (UID: \"5278aa8f-5ca9-4e1c-b485-9e771d15c63d\") " pod="openshift-infra/auto-csr-approver-29537856-gvn4m" Feb 28 09:36:00 crc kubenswrapper[4687]: I0228 09:36:00.380098 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nklkb\" (UniqueName: \"kubernetes.io/projected/5278aa8f-5ca9-4e1c-b485-9e771d15c63d-kube-api-access-nklkb\") pod \"auto-csr-approver-29537856-gvn4m\" (UID: \"5278aa8f-5ca9-4e1c-b485-9e771d15c63d\") " pod="openshift-infra/auto-csr-approver-29537856-gvn4m" Feb 28 09:36:00 crc kubenswrapper[4687]: I0228 09:36:00.397713 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nklkb\" (UniqueName: \"kubernetes.io/projected/5278aa8f-5ca9-4e1c-b485-9e771d15c63d-kube-api-access-nklkb\") pod \"auto-csr-approver-29537856-gvn4m\" (UID: \"5278aa8f-5ca9-4e1c-b485-9e771d15c63d\") " 
pod="openshift-infra/auto-csr-approver-29537856-gvn4m" Feb 28 09:36:00 crc kubenswrapper[4687]: I0228 09:36:00.455611 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537856-gvn4m" Feb 28 09:36:00 crc kubenswrapper[4687]: I0228 09:36:00.831148 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537856-gvn4m"] Feb 28 09:36:01 crc kubenswrapper[4687]: I0228 09:36:01.220253 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537856-gvn4m" event={"ID":"5278aa8f-5ca9-4e1c-b485-9e771d15c63d","Type":"ContainerStarted","Data":"69a69f5520e33ee661dae72b3992958a4a51bf2503d544b459663f350e8ac36e"} Feb 28 09:36:02 crc kubenswrapper[4687]: I0228 09:36:02.228516 4687 generic.go:334] "Generic (PLEG): container finished" podID="5278aa8f-5ca9-4e1c-b485-9e771d15c63d" containerID="c556349e72ced23123967647159678349d39af8ed507304146ffc55a17f4f35a" exitCode=0 Feb 28 09:36:02 crc kubenswrapper[4687]: I0228 09:36:02.228571 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537856-gvn4m" event={"ID":"5278aa8f-5ca9-4e1c-b485-9e771d15c63d","Type":"ContainerDied","Data":"c556349e72ced23123967647159678349d39af8ed507304146ffc55a17f4f35a"} Feb 28 09:36:03 crc kubenswrapper[4687]: I0228 09:36:03.494194 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537856-gvn4m" Feb 28 09:36:03 crc kubenswrapper[4687]: I0228 09:36:03.540497 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nklkb\" (UniqueName: \"kubernetes.io/projected/5278aa8f-5ca9-4e1c-b485-9e771d15c63d-kube-api-access-nklkb\") pod \"5278aa8f-5ca9-4e1c-b485-9e771d15c63d\" (UID: \"5278aa8f-5ca9-4e1c-b485-9e771d15c63d\") " Feb 28 09:36:03 crc kubenswrapper[4687]: I0228 09:36:03.546267 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5278aa8f-5ca9-4e1c-b485-9e771d15c63d-kube-api-access-nklkb" (OuterVolumeSpecName: "kube-api-access-nklkb") pod "5278aa8f-5ca9-4e1c-b485-9e771d15c63d" (UID: "5278aa8f-5ca9-4e1c-b485-9e771d15c63d"). InnerVolumeSpecName "kube-api-access-nklkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:36:03 crc kubenswrapper[4687]: I0228 09:36:03.642623 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nklkb\" (UniqueName: \"kubernetes.io/projected/5278aa8f-5ca9-4e1c-b485-9e771d15c63d-kube-api-access-nklkb\") on node \"crc\" DevicePath \"\"" Feb 28 09:36:04 crc kubenswrapper[4687]: I0228 09:36:04.249961 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537856-gvn4m" event={"ID":"5278aa8f-5ca9-4e1c-b485-9e771d15c63d","Type":"ContainerDied","Data":"69a69f5520e33ee661dae72b3992958a4a51bf2503d544b459663f350e8ac36e"} Feb 28 09:36:04 crc kubenswrapper[4687]: I0228 09:36:04.250011 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69a69f5520e33ee661dae72b3992958a4a51bf2503d544b459663f350e8ac36e" Feb 28 09:36:04 crc kubenswrapper[4687]: I0228 09:36:04.250078 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537856-gvn4m" Feb 28 09:36:04 crc kubenswrapper[4687]: I0228 09:36:04.560548 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537850-sgh2x"] Feb 28 09:36:04 crc kubenswrapper[4687]: I0228 09:36:04.566359 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537850-sgh2x"] Feb 28 09:36:04 crc kubenswrapper[4687]: I0228 09:36:04.666496 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9d983fd-590f-4ca9-8de5-361bc4f3a6f2" path="/var/lib/kubelet/pods/e9d983fd-590f-4ca9-8de5-361bc4f3a6f2/volumes" Feb 28 09:36:25 crc kubenswrapper[4687]: I0228 09:36:25.003651 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:36:25 crc kubenswrapper[4687]: I0228 09:36:25.005137 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:36:33 crc kubenswrapper[4687]: I0228 09:36:33.526441 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pdkp6"] Feb 28 09:36:33 crc kubenswrapper[4687]: E0228 09:36:33.527696 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5278aa8f-5ca9-4e1c-b485-9e771d15c63d" containerName="oc" Feb 28 09:36:33 crc kubenswrapper[4687]: I0228 09:36:33.527715 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="5278aa8f-5ca9-4e1c-b485-9e771d15c63d" containerName="oc" Feb 28 09:36:33 crc 
kubenswrapper[4687]: I0228 09:36:33.528064 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="5278aa8f-5ca9-4e1c-b485-9e771d15c63d" containerName="oc" Feb 28 09:36:33 crc kubenswrapper[4687]: I0228 09:36:33.529758 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pdkp6" Feb 28 09:36:33 crc kubenswrapper[4687]: I0228 09:36:33.538252 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pdkp6"] Feb 28 09:36:33 crc kubenswrapper[4687]: I0228 09:36:33.617995 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66c7d2dc-b5e8-454c-92ea-57f7d2465681-catalog-content\") pod \"redhat-operators-pdkp6\" (UID: \"66c7d2dc-b5e8-454c-92ea-57f7d2465681\") " pod="openshift-marketplace/redhat-operators-pdkp6" Feb 28 09:36:33 crc kubenswrapper[4687]: I0228 09:36:33.618087 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf7c7\" (UniqueName: \"kubernetes.io/projected/66c7d2dc-b5e8-454c-92ea-57f7d2465681-kube-api-access-qf7c7\") pod \"redhat-operators-pdkp6\" (UID: \"66c7d2dc-b5e8-454c-92ea-57f7d2465681\") " pod="openshift-marketplace/redhat-operators-pdkp6" Feb 28 09:36:33 crc kubenswrapper[4687]: I0228 09:36:33.618168 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66c7d2dc-b5e8-454c-92ea-57f7d2465681-utilities\") pod \"redhat-operators-pdkp6\" (UID: \"66c7d2dc-b5e8-454c-92ea-57f7d2465681\") " pod="openshift-marketplace/redhat-operators-pdkp6" Feb 28 09:36:33 crc kubenswrapper[4687]: I0228 09:36:33.719861 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/66c7d2dc-b5e8-454c-92ea-57f7d2465681-catalog-content\") pod \"redhat-operators-pdkp6\" (UID: \"66c7d2dc-b5e8-454c-92ea-57f7d2465681\") " pod="openshift-marketplace/redhat-operators-pdkp6" Feb 28 09:36:33 crc kubenswrapper[4687]: I0228 09:36:33.720121 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf7c7\" (UniqueName: \"kubernetes.io/projected/66c7d2dc-b5e8-454c-92ea-57f7d2465681-kube-api-access-qf7c7\") pod \"redhat-operators-pdkp6\" (UID: \"66c7d2dc-b5e8-454c-92ea-57f7d2465681\") " pod="openshift-marketplace/redhat-operators-pdkp6" Feb 28 09:36:33 crc kubenswrapper[4687]: I0228 09:36:33.720510 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66c7d2dc-b5e8-454c-92ea-57f7d2465681-utilities\") pod \"redhat-operators-pdkp6\" (UID: \"66c7d2dc-b5e8-454c-92ea-57f7d2465681\") " pod="openshift-marketplace/redhat-operators-pdkp6" Feb 28 09:36:33 crc kubenswrapper[4687]: I0228 09:36:33.720897 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66c7d2dc-b5e8-454c-92ea-57f7d2465681-catalog-content\") pod \"redhat-operators-pdkp6\" (UID: \"66c7d2dc-b5e8-454c-92ea-57f7d2465681\") " pod="openshift-marketplace/redhat-operators-pdkp6" Feb 28 09:36:33 crc kubenswrapper[4687]: I0228 09:36:33.721958 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66c7d2dc-b5e8-454c-92ea-57f7d2465681-utilities\") pod \"redhat-operators-pdkp6\" (UID: \"66c7d2dc-b5e8-454c-92ea-57f7d2465681\") " pod="openshift-marketplace/redhat-operators-pdkp6" Feb 28 09:36:33 crc kubenswrapper[4687]: I0228 09:36:33.738843 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf7c7\" (UniqueName: 
\"kubernetes.io/projected/66c7d2dc-b5e8-454c-92ea-57f7d2465681-kube-api-access-qf7c7\") pod \"redhat-operators-pdkp6\" (UID: \"66c7d2dc-b5e8-454c-92ea-57f7d2465681\") " pod="openshift-marketplace/redhat-operators-pdkp6" Feb 28 09:36:33 crc kubenswrapper[4687]: I0228 09:36:33.849626 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pdkp6" Feb 28 09:36:34 crc kubenswrapper[4687]: I0228 09:36:34.283233 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pdkp6"] Feb 28 09:36:34 crc kubenswrapper[4687]: I0228 09:36:34.494636 4687 generic.go:334] "Generic (PLEG): container finished" podID="66c7d2dc-b5e8-454c-92ea-57f7d2465681" containerID="ce32ab13e80fc9510c374176b4515d740478f238552fb8cd41dd50dfe6a3ccfe" exitCode=0 Feb 28 09:36:34 crc kubenswrapper[4687]: I0228 09:36:34.494690 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdkp6" event={"ID":"66c7d2dc-b5e8-454c-92ea-57f7d2465681","Type":"ContainerDied","Data":"ce32ab13e80fc9510c374176b4515d740478f238552fb8cd41dd50dfe6a3ccfe"} Feb 28 09:36:34 crc kubenswrapper[4687]: I0228 09:36:34.494721 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdkp6" event={"ID":"66c7d2dc-b5e8-454c-92ea-57f7d2465681","Type":"ContainerStarted","Data":"c25108b9231d746c05dcd17f35f3679b2c6a18ae09a4d75872075ed4d98d7795"} Feb 28 09:36:35 crc kubenswrapper[4687]: I0228 09:36:35.506081 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdkp6" event={"ID":"66c7d2dc-b5e8-454c-92ea-57f7d2465681","Type":"ContainerStarted","Data":"504f3c89ed95f063025c7dc577dd349a711792a3260f1de9e43080c11af2ef60"} Feb 28 09:36:37 crc kubenswrapper[4687]: I0228 09:36:37.538597 4687 generic.go:334] "Generic (PLEG): container finished" podID="66c7d2dc-b5e8-454c-92ea-57f7d2465681" 
containerID="504f3c89ed95f063025c7dc577dd349a711792a3260f1de9e43080c11af2ef60" exitCode=0 Feb 28 09:36:37 crc kubenswrapper[4687]: I0228 09:36:37.538669 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdkp6" event={"ID":"66c7d2dc-b5e8-454c-92ea-57f7d2465681","Type":"ContainerDied","Data":"504f3c89ed95f063025c7dc577dd349a711792a3260f1de9e43080c11af2ef60"} Feb 28 09:36:38 crc kubenswrapper[4687]: I0228 09:36:38.549890 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdkp6" event={"ID":"66c7d2dc-b5e8-454c-92ea-57f7d2465681","Type":"ContainerStarted","Data":"5c1619aa4f629f974b8f9f91811f3ed2ca5e76579d08cd904d5e6a1bed1fa9d7"} Feb 28 09:36:38 crc kubenswrapper[4687]: I0228 09:36:38.573426 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pdkp6" podStartSLOduration=2.061862002 podStartE2EDuration="5.573406847s" podCreationTimestamp="2026-02-28 09:36:33 +0000 UTC" firstStartedPulling="2026-02-28 09:36:34.49660327 +0000 UTC m=+1986.187172607" lastFinishedPulling="2026-02-28 09:36:38.008148116 +0000 UTC m=+1989.698717452" observedRunningTime="2026-02-28 09:36:38.567829202 +0000 UTC m=+1990.258398549" watchObservedRunningTime="2026-02-28 09:36:38.573406847 +0000 UTC m=+1990.263976184" Feb 28 09:36:41 crc kubenswrapper[4687]: I0228 09:36:41.518145 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2dz6h"] Feb 28 09:36:41 crc kubenswrapper[4687]: I0228 09:36:41.520111 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2dz6h" Feb 28 09:36:41 crc kubenswrapper[4687]: I0228 09:36:41.529376 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2dz6h"] Feb 28 09:36:41 crc kubenswrapper[4687]: I0228 09:36:41.568727 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c06f8d8a-def7-4ddd-9c6d-626677a91dc3-utilities\") pod \"certified-operators-2dz6h\" (UID: \"c06f8d8a-def7-4ddd-9c6d-626677a91dc3\") " pod="openshift-marketplace/certified-operators-2dz6h" Feb 28 09:36:41 crc kubenswrapper[4687]: I0228 09:36:41.568786 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c06f8d8a-def7-4ddd-9c6d-626677a91dc3-catalog-content\") pod \"certified-operators-2dz6h\" (UID: \"c06f8d8a-def7-4ddd-9c6d-626677a91dc3\") " pod="openshift-marketplace/certified-operators-2dz6h" Feb 28 09:36:41 crc kubenswrapper[4687]: I0228 09:36:41.569058 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7glpx\" (UniqueName: \"kubernetes.io/projected/c06f8d8a-def7-4ddd-9c6d-626677a91dc3-kube-api-access-7glpx\") pod \"certified-operators-2dz6h\" (UID: \"c06f8d8a-def7-4ddd-9c6d-626677a91dc3\") " pod="openshift-marketplace/certified-operators-2dz6h" Feb 28 09:36:41 crc kubenswrapper[4687]: I0228 09:36:41.670500 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c06f8d8a-def7-4ddd-9c6d-626677a91dc3-utilities\") pod \"certified-operators-2dz6h\" (UID: \"c06f8d8a-def7-4ddd-9c6d-626677a91dc3\") " pod="openshift-marketplace/certified-operators-2dz6h" Feb 28 09:36:41 crc kubenswrapper[4687]: I0228 09:36:41.670566 4687 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c06f8d8a-def7-4ddd-9c6d-626677a91dc3-catalog-content\") pod \"certified-operators-2dz6h\" (UID: \"c06f8d8a-def7-4ddd-9c6d-626677a91dc3\") " pod="openshift-marketplace/certified-operators-2dz6h" Feb 28 09:36:41 crc kubenswrapper[4687]: I0228 09:36:41.671297 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c06f8d8a-def7-4ddd-9c6d-626677a91dc3-catalog-content\") pod \"certified-operators-2dz6h\" (UID: \"c06f8d8a-def7-4ddd-9c6d-626677a91dc3\") " pod="openshift-marketplace/certified-operators-2dz6h" Feb 28 09:36:41 crc kubenswrapper[4687]: I0228 09:36:41.671383 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7glpx\" (UniqueName: \"kubernetes.io/projected/c06f8d8a-def7-4ddd-9c6d-626677a91dc3-kube-api-access-7glpx\") pod \"certified-operators-2dz6h\" (UID: \"c06f8d8a-def7-4ddd-9c6d-626677a91dc3\") " pod="openshift-marketplace/certified-operators-2dz6h" Feb 28 09:36:41 crc kubenswrapper[4687]: I0228 09:36:41.671384 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c06f8d8a-def7-4ddd-9c6d-626677a91dc3-utilities\") pod \"certified-operators-2dz6h\" (UID: \"c06f8d8a-def7-4ddd-9c6d-626677a91dc3\") " pod="openshift-marketplace/certified-operators-2dz6h" Feb 28 09:36:41 crc kubenswrapper[4687]: I0228 09:36:41.687635 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7glpx\" (UniqueName: \"kubernetes.io/projected/c06f8d8a-def7-4ddd-9c6d-626677a91dc3-kube-api-access-7glpx\") pod \"certified-operators-2dz6h\" (UID: \"c06f8d8a-def7-4ddd-9c6d-626677a91dc3\") " pod="openshift-marketplace/certified-operators-2dz6h" Feb 28 09:36:41 crc kubenswrapper[4687]: I0228 09:36:41.834758 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2dz6h" Feb 28 09:36:42 crc kubenswrapper[4687]: I0228 09:36:42.066201 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2dz6h"] Feb 28 09:36:42 crc kubenswrapper[4687]: W0228 09:36:42.067134 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc06f8d8a_def7_4ddd_9c6d_626677a91dc3.slice/crio-75aa29bd768584a5f5f1a0144f9fdf1be9e97d343b81bf398230f90d45f2dfdc WatchSource:0}: Error finding container 75aa29bd768584a5f5f1a0144f9fdf1be9e97d343b81bf398230f90d45f2dfdc: Status 404 returned error can't find the container with id 75aa29bd768584a5f5f1a0144f9fdf1be9e97d343b81bf398230f90d45f2dfdc Feb 28 09:36:42 crc kubenswrapper[4687]: I0228 09:36:42.575795 4687 generic.go:334] "Generic (PLEG): container finished" podID="c06f8d8a-def7-4ddd-9c6d-626677a91dc3" containerID="bc8e36cf6d0d4961d2efa08dbf603161c84623b1b5666e144b7975ea7932ceec" exitCode=0 Feb 28 09:36:42 crc kubenswrapper[4687]: I0228 09:36:42.575894 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dz6h" event={"ID":"c06f8d8a-def7-4ddd-9c6d-626677a91dc3","Type":"ContainerDied","Data":"bc8e36cf6d0d4961d2efa08dbf603161c84623b1b5666e144b7975ea7932ceec"} Feb 28 09:36:42 crc kubenswrapper[4687]: I0228 09:36:42.576096 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dz6h" event={"ID":"c06f8d8a-def7-4ddd-9c6d-626677a91dc3","Type":"ContainerStarted","Data":"75aa29bd768584a5f5f1a0144f9fdf1be9e97d343b81bf398230f90d45f2dfdc"} Feb 28 09:36:43 crc kubenswrapper[4687]: I0228 09:36:43.164176 4687 scope.go:117] "RemoveContainer" containerID="4198b35ab426e3f30912756a63f22176d5af8a8651a16dc4c67516d7e15a6674" Feb 28 09:36:43 crc kubenswrapper[4687]: I0228 09:36:43.588259 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-2dz6h" event={"ID":"c06f8d8a-def7-4ddd-9c6d-626677a91dc3","Type":"ContainerStarted","Data":"78a0d01836feec22d504f4602399c33502eb162b504d46eb3de0ba4b58bda2c8"} Feb 28 09:36:43 crc kubenswrapper[4687]: I0228 09:36:43.850368 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pdkp6" Feb 28 09:36:43 crc kubenswrapper[4687]: I0228 09:36:43.850411 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pdkp6" Feb 28 09:36:43 crc kubenswrapper[4687]: I0228 09:36:43.884008 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pdkp6" Feb 28 09:36:44 crc kubenswrapper[4687]: I0228 09:36:44.596089 4687 generic.go:334] "Generic (PLEG): container finished" podID="c06f8d8a-def7-4ddd-9c6d-626677a91dc3" containerID="78a0d01836feec22d504f4602399c33502eb162b504d46eb3de0ba4b58bda2c8" exitCode=0 Feb 28 09:36:44 crc kubenswrapper[4687]: I0228 09:36:44.596175 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dz6h" event={"ID":"c06f8d8a-def7-4ddd-9c6d-626677a91dc3","Type":"ContainerDied","Data":"78a0d01836feec22d504f4602399c33502eb162b504d46eb3de0ba4b58bda2c8"} Feb 28 09:36:44 crc kubenswrapper[4687]: I0228 09:36:44.630281 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pdkp6" Feb 28 09:36:45 crc kubenswrapper[4687]: I0228 09:36:45.605673 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dz6h" event={"ID":"c06f8d8a-def7-4ddd-9c6d-626677a91dc3","Type":"ContainerStarted","Data":"7697014180f66a36e3ea4d6327efb6f4452d726c7d076e10453864280a7de4e5"} Feb 28 09:36:45 crc kubenswrapper[4687]: I0228 09:36:45.624885 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-2dz6h" podStartSLOduration=2.159106761 podStartE2EDuration="4.624871648s" podCreationTimestamp="2026-02-28 09:36:41 +0000 UTC" firstStartedPulling="2026-02-28 09:36:42.57744789 +0000 UTC m=+1994.268017227" lastFinishedPulling="2026-02-28 09:36:45.043212777 +0000 UTC m=+1996.733782114" observedRunningTime="2026-02-28 09:36:45.618008887 +0000 UTC m=+1997.308578224" watchObservedRunningTime="2026-02-28 09:36:45.624871648 +0000 UTC m=+1997.315440984" Feb 28 09:36:47 crc kubenswrapper[4687]: I0228 09:36:47.911251 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pdkp6"] Feb 28 09:36:47 crc kubenswrapper[4687]: I0228 09:36:47.911672 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pdkp6" podUID="66c7d2dc-b5e8-454c-92ea-57f7d2465681" containerName="registry-server" containerID="cri-o://5c1619aa4f629f974b8f9f91811f3ed2ca5e76579d08cd904d5e6a1bed1fa9d7" gracePeriod=2 Feb 28 09:36:48 crc kubenswrapper[4687]: I0228 09:36:48.268365 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pdkp6" Feb 28 09:36:48 crc kubenswrapper[4687]: I0228 09:36:48.269193 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66c7d2dc-b5e8-454c-92ea-57f7d2465681-catalog-content\") pod \"66c7d2dc-b5e8-454c-92ea-57f7d2465681\" (UID: \"66c7d2dc-b5e8-454c-92ea-57f7d2465681\") " Feb 28 09:36:48 crc kubenswrapper[4687]: I0228 09:36:48.269357 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qf7c7\" (UniqueName: \"kubernetes.io/projected/66c7d2dc-b5e8-454c-92ea-57f7d2465681-kube-api-access-qf7c7\") pod \"66c7d2dc-b5e8-454c-92ea-57f7d2465681\" (UID: \"66c7d2dc-b5e8-454c-92ea-57f7d2465681\") " Feb 28 09:36:48 crc kubenswrapper[4687]: I0228 09:36:48.269427 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66c7d2dc-b5e8-454c-92ea-57f7d2465681-utilities\") pod \"66c7d2dc-b5e8-454c-92ea-57f7d2465681\" (UID: \"66c7d2dc-b5e8-454c-92ea-57f7d2465681\") " Feb 28 09:36:48 crc kubenswrapper[4687]: I0228 09:36:48.270360 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66c7d2dc-b5e8-454c-92ea-57f7d2465681-utilities" (OuterVolumeSpecName: "utilities") pod "66c7d2dc-b5e8-454c-92ea-57f7d2465681" (UID: "66c7d2dc-b5e8-454c-92ea-57f7d2465681"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:36:48 crc kubenswrapper[4687]: I0228 09:36:48.275128 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66c7d2dc-b5e8-454c-92ea-57f7d2465681-kube-api-access-qf7c7" (OuterVolumeSpecName: "kube-api-access-qf7c7") pod "66c7d2dc-b5e8-454c-92ea-57f7d2465681" (UID: "66c7d2dc-b5e8-454c-92ea-57f7d2465681"). InnerVolumeSpecName "kube-api-access-qf7c7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:36:48 crc kubenswrapper[4687]: I0228 09:36:48.364557 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66c7d2dc-b5e8-454c-92ea-57f7d2465681-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66c7d2dc-b5e8-454c-92ea-57f7d2465681" (UID: "66c7d2dc-b5e8-454c-92ea-57f7d2465681"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:36:48 crc kubenswrapper[4687]: I0228 09:36:48.370475 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66c7d2dc-b5e8-454c-92ea-57f7d2465681-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:36:48 crc kubenswrapper[4687]: I0228 09:36:48.370501 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qf7c7\" (UniqueName: \"kubernetes.io/projected/66c7d2dc-b5e8-454c-92ea-57f7d2465681-kube-api-access-qf7c7\") on node \"crc\" DevicePath \"\"" Feb 28 09:36:48 crc kubenswrapper[4687]: I0228 09:36:48.370512 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66c7d2dc-b5e8-454c-92ea-57f7d2465681-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:36:48 crc kubenswrapper[4687]: I0228 09:36:48.631421 4687 generic.go:334] "Generic (PLEG): container finished" podID="66c7d2dc-b5e8-454c-92ea-57f7d2465681" containerID="5c1619aa4f629f974b8f9f91811f3ed2ca5e76579d08cd904d5e6a1bed1fa9d7" exitCode=0 Feb 28 09:36:48 crc kubenswrapper[4687]: I0228 09:36:48.631488 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pdkp6" Feb 28 09:36:48 crc kubenswrapper[4687]: I0228 09:36:48.631499 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdkp6" event={"ID":"66c7d2dc-b5e8-454c-92ea-57f7d2465681","Type":"ContainerDied","Data":"5c1619aa4f629f974b8f9f91811f3ed2ca5e76579d08cd904d5e6a1bed1fa9d7"} Feb 28 09:36:48 crc kubenswrapper[4687]: I0228 09:36:48.631775 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdkp6" event={"ID":"66c7d2dc-b5e8-454c-92ea-57f7d2465681","Type":"ContainerDied","Data":"c25108b9231d746c05dcd17f35f3679b2c6a18ae09a4d75872075ed4d98d7795"} Feb 28 09:36:48 crc kubenswrapper[4687]: I0228 09:36:48.631793 4687 scope.go:117] "RemoveContainer" containerID="5c1619aa4f629f974b8f9f91811f3ed2ca5e76579d08cd904d5e6a1bed1fa9d7" Feb 28 09:36:48 crc kubenswrapper[4687]: I0228 09:36:48.657192 4687 scope.go:117] "RemoveContainer" containerID="504f3c89ed95f063025c7dc577dd349a711792a3260f1de9e43080c11af2ef60" Feb 28 09:36:48 crc kubenswrapper[4687]: I0228 09:36:48.663992 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pdkp6"] Feb 28 09:36:48 crc kubenswrapper[4687]: I0228 09:36:48.667430 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pdkp6"] Feb 28 09:36:48 crc kubenswrapper[4687]: I0228 09:36:48.683096 4687 scope.go:117] "RemoveContainer" containerID="ce32ab13e80fc9510c374176b4515d740478f238552fb8cd41dd50dfe6a3ccfe" Feb 28 09:36:48 crc kubenswrapper[4687]: I0228 09:36:48.705101 4687 scope.go:117] "RemoveContainer" containerID="5c1619aa4f629f974b8f9f91811f3ed2ca5e76579d08cd904d5e6a1bed1fa9d7" Feb 28 09:36:48 crc kubenswrapper[4687]: E0228 09:36:48.705448 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5c1619aa4f629f974b8f9f91811f3ed2ca5e76579d08cd904d5e6a1bed1fa9d7\": container with ID starting with 5c1619aa4f629f974b8f9f91811f3ed2ca5e76579d08cd904d5e6a1bed1fa9d7 not found: ID does not exist" containerID="5c1619aa4f629f974b8f9f91811f3ed2ca5e76579d08cd904d5e6a1bed1fa9d7" Feb 28 09:36:48 crc kubenswrapper[4687]: I0228 09:36:48.705485 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c1619aa4f629f974b8f9f91811f3ed2ca5e76579d08cd904d5e6a1bed1fa9d7"} err="failed to get container status \"5c1619aa4f629f974b8f9f91811f3ed2ca5e76579d08cd904d5e6a1bed1fa9d7\": rpc error: code = NotFound desc = could not find container \"5c1619aa4f629f974b8f9f91811f3ed2ca5e76579d08cd904d5e6a1bed1fa9d7\": container with ID starting with 5c1619aa4f629f974b8f9f91811f3ed2ca5e76579d08cd904d5e6a1bed1fa9d7 not found: ID does not exist" Feb 28 09:36:48 crc kubenswrapper[4687]: I0228 09:36:48.705510 4687 scope.go:117] "RemoveContainer" containerID="504f3c89ed95f063025c7dc577dd349a711792a3260f1de9e43080c11af2ef60" Feb 28 09:36:48 crc kubenswrapper[4687]: E0228 09:36:48.705731 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"504f3c89ed95f063025c7dc577dd349a711792a3260f1de9e43080c11af2ef60\": container with ID starting with 504f3c89ed95f063025c7dc577dd349a711792a3260f1de9e43080c11af2ef60 not found: ID does not exist" containerID="504f3c89ed95f063025c7dc577dd349a711792a3260f1de9e43080c11af2ef60" Feb 28 09:36:48 crc kubenswrapper[4687]: I0228 09:36:48.705757 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"504f3c89ed95f063025c7dc577dd349a711792a3260f1de9e43080c11af2ef60"} err="failed to get container status \"504f3c89ed95f063025c7dc577dd349a711792a3260f1de9e43080c11af2ef60\": rpc error: code = NotFound desc = could not find container \"504f3c89ed95f063025c7dc577dd349a711792a3260f1de9e43080c11af2ef60\": container with ID 
starting with 504f3c89ed95f063025c7dc577dd349a711792a3260f1de9e43080c11af2ef60 not found: ID does not exist" Feb 28 09:36:48 crc kubenswrapper[4687]: I0228 09:36:48.705774 4687 scope.go:117] "RemoveContainer" containerID="ce32ab13e80fc9510c374176b4515d740478f238552fb8cd41dd50dfe6a3ccfe" Feb 28 09:36:48 crc kubenswrapper[4687]: E0228 09:36:48.706014 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce32ab13e80fc9510c374176b4515d740478f238552fb8cd41dd50dfe6a3ccfe\": container with ID starting with ce32ab13e80fc9510c374176b4515d740478f238552fb8cd41dd50dfe6a3ccfe not found: ID does not exist" containerID="ce32ab13e80fc9510c374176b4515d740478f238552fb8cd41dd50dfe6a3ccfe" Feb 28 09:36:48 crc kubenswrapper[4687]: I0228 09:36:48.706051 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce32ab13e80fc9510c374176b4515d740478f238552fb8cd41dd50dfe6a3ccfe"} err="failed to get container status \"ce32ab13e80fc9510c374176b4515d740478f238552fb8cd41dd50dfe6a3ccfe\": rpc error: code = NotFound desc = could not find container \"ce32ab13e80fc9510c374176b4515d740478f238552fb8cd41dd50dfe6a3ccfe\": container with ID starting with ce32ab13e80fc9510c374176b4515d740478f238552fb8cd41dd50dfe6a3ccfe not found: ID does not exist" Feb 28 09:36:50 crc kubenswrapper[4687]: I0228 09:36:50.664805 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66c7d2dc-b5e8-454c-92ea-57f7d2465681" path="/var/lib/kubelet/pods/66c7d2dc-b5e8-454c-92ea-57f7d2465681/volumes" Feb 28 09:36:51 crc kubenswrapper[4687]: I0228 09:36:51.835987 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2dz6h" Feb 28 09:36:51 crc kubenswrapper[4687]: I0228 09:36:51.836067 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2dz6h" Feb 28 09:36:51 crc 
kubenswrapper[4687]: I0228 09:36:51.871346 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2dz6h" Feb 28 09:36:52 crc kubenswrapper[4687]: I0228 09:36:52.691748 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2dz6h" Feb 28 09:36:52 crc kubenswrapper[4687]: I0228 09:36:52.727803 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2dz6h"] Feb 28 09:36:54 crc kubenswrapper[4687]: I0228 09:36:54.673452 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2dz6h" podUID="c06f8d8a-def7-4ddd-9c6d-626677a91dc3" containerName="registry-server" containerID="cri-o://7697014180f66a36e3ea4d6327efb6f4452d726c7d076e10453864280a7de4e5" gracePeriod=2 Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.002878 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.003235 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.052982 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2dz6h" Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.087720 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c06f8d8a-def7-4ddd-9c6d-626677a91dc3-utilities\") pod \"c06f8d8a-def7-4ddd-9c6d-626677a91dc3\" (UID: \"c06f8d8a-def7-4ddd-9c6d-626677a91dc3\") " Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.087767 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7glpx\" (UniqueName: \"kubernetes.io/projected/c06f8d8a-def7-4ddd-9c6d-626677a91dc3-kube-api-access-7glpx\") pod \"c06f8d8a-def7-4ddd-9c6d-626677a91dc3\" (UID: \"c06f8d8a-def7-4ddd-9c6d-626677a91dc3\") " Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.087818 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c06f8d8a-def7-4ddd-9c6d-626677a91dc3-catalog-content\") pod \"c06f8d8a-def7-4ddd-9c6d-626677a91dc3\" (UID: \"c06f8d8a-def7-4ddd-9c6d-626677a91dc3\") " Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.088644 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c06f8d8a-def7-4ddd-9c6d-626677a91dc3-utilities" (OuterVolumeSpecName: "utilities") pod "c06f8d8a-def7-4ddd-9c6d-626677a91dc3" (UID: "c06f8d8a-def7-4ddd-9c6d-626677a91dc3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.102085 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c06f8d8a-def7-4ddd-9c6d-626677a91dc3-kube-api-access-7glpx" (OuterVolumeSpecName: "kube-api-access-7glpx") pod "c06f8d8a-def7-4ddd-9c6d-626677a91dc3" (UID: "c06f8d8a-def7-4ddd-9c6d-626677a91dc3"). InnerVolumeSpecName "kube-api-access-7glpx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.132344 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c06f8d8a-def7-4ddd-9c6d-626677a91dc3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c06f8d8a-def7-4ddd-9c6d-626677a91dc3" (UID: "c06f8d8a-def7-4ddd-9c6d-626677a91dc3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.189917 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c06f8d8a-def7-4ddd-9c6d-626677a91dc3-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.189943 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7glpx\" (UniqueName: \"kubernetes.io/projected/c06f8d8a-def7-4ddd-9c6d-626677a91dc3-kube-api-access-7glpx\") on node \"crc\" DevicePath \"\"" Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.189954 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c06f8d8a-def7-4ddd-9c6d-626677a91dc3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.686320 4687 generic.go:334] "Generic (PLEG): container finished" podID="c06f8d8a-def7-4ddd-9c6d-626677a91dc3" containerID="7697014180f66a36e3ea4d6327efb6f4452d726c7d076e10453864280a7de4e5" exitCode=0 Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.686369 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dz6h" event={"ID":"c06f8d8a-def7-4ddd-9c6d-626677a91dc3","Type":"ContainerDied","Data":"7697014180f66a36e3ea4d6327efb6f4452d726c7d076e10453864280a7de4e5"} Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.686360 4687 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2dz6h" Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.686405 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dz6h" event={"ID":"c06f8d8a-def7-4ddd-9c6d-626677a91dc3","Type":"ContainerDied","Data":"75aa29bd768584a5f5f1a0144f9fdf1be9e97d343b81bf398230f90d45f2dfdc"} Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.686423 4687 scope.go:117] "RemoveContainer" containerID="7697014180f66a36e3ea4d6327efb6f4452d726c7d076e10453864280a7de4e5" Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.701789 4687 scope.go:117] "RemoveContainer" containerID="78a0d01836feec22d504f4602399c33502eb162b504d46eb3de0ba4b58bda2c8" Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.714135 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2dz6h"] Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.720466 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2dz6h"] Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.733699 4687 scope.go:117] "RemoveContainer" containerID="bc8e36cf6d0d4961d2efa08dbf603161c84623b1b5666e144b7975ea7932ceec" Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.751643 4687 scope.go:117] "RemoveContainer" containerID="7697014180f66a36e3ea4d6327efb6f4452d726c7d076e10453864280a7de4e5" Feb 28 09:36:55 crc kubenswrapper[4687]: E0228 09:36:55.752073 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7697014180f66a36e3ea4d6327efb6f4452d726c7d076e10453864280a7de4e5\": container with ID starting with 7697014180f66a36e3ea4d6327efb6f4452d726c7d076e10453864280a7de4e5 not found: ID does not exist" containerID="7697014180f66a36e3ea4d6327efb6f4452d726c7d076e10453864280a7de4e5" Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.752114 
4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7697014180f66a36e3ea4d6327efb6f4452d726c7d076e10453864280a7de4e5"} err="failed to get container status \"7697014180f66a36e3ea4d6327efb6f4452d726c7d076e10453864280a7de4e5\": rpc error: code = NotFound desc = could not find container \"7697014180f66a36e3ea4d6327efb6f4452d726c7d076e10453864280a7de4e5\": container with ID starting with 7697014180f66a36e3ea4d6327efb6f4452d726c7d076e10453864280a7de4e5 not found: ID does not exist" Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.752139 4687 scope.go:117] "RemoveContainer" containerID="78a0d01836feec22d504f4602399c33502eb162b504d46eb3de0ba4b58bda2c8" Feb 28 09:36:55 crc kubenswrapper[4687]: E0228 09:36:55.752450 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78a0d01836feec22d504f4602399c33502eb162b504d46eb3de0ba4b58bda2c8\": container with ID starting with 78a0d01836feec22d504f4602399c33502eb162b504d46eb3de0ba4b58bda2c8 not found: ID does not exist" containerID="78a0d01836feec22d504f4602399c33502eb162b504d46eb3de0ba4b58bda2c8" Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.752470 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78a0d01836feec22d504f4602399c33502eb162b504d46eb3de0ba4b58bda2c8"} err="failed to get container status \"78a0d01836feec22d504f4602399c33502eb162b504d46eb3de0ba4b58bda2c8\": rpc error: code = NotFound desc = could not find container \"78a0d01836feec22d504f4602399c33502eb162b504d46eb3de0ba4b58bda2c8\": container with ID starting with 78a0d01836feec22d504f4602399c33502eb162b504d46eb3de0ba4b58bda2c8 not found: ID does not exist" Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.752484 4687 scope.go:117] "RemoveContainer" containerID="bc8e36cf6d0d4961d2efa08dbf603161c84623b1b5666e144b7975ea7932ceec" Feb 28 09:36:55 crc kubenswrapper[4687]: E0228 
09:36:55.752681 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc8e36cf6d0d4961d2efa08dbf603161c84623b1b5666e144b7975ea7932ceec\": container with ID starting with bc8e36cf6d0d4961d2efa08dbf603161c84623b1b5666e144b7975ea7932ceec not found: ID does not exist" containerID="bc8e36cf6d0d4961d2efa08dbf603161c84623b1b5666e144b7975ea7932ceec" Feb 28 09:36:55 crc kubenswrapper[4687]: I0228 09:36:55.752697 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc8e36cf6d0d4961d2efa08dbf603161c84623b1b5666e144b7975ea7932ceec"} err="failed to get container status \"bc8e36cf6d0d4961d2efa08dbf603161c84623b1b5666e144b7975ea7932ceec\": rpc error: code = NotFound desc = could not find container \"bc8e36cf6d0d4961d2efa08dbf603161c84623b1b5666e144b7975ea7932ceec\": container with ID starting with bc8e36cf6d0d4961d2efa08dbf603161c84623b1b5666e144b7975ea7932ceec not found: ID does not exist" Feb 28 09:36:56 crc kubenswrapper[4687]: I0228 09:36:56.664264 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c06f8d8a-def7-4ddd-9c6d-626677a91dc3" path="/var/lib/kubelet/pods/c06f8d8a-def7-4ddd-9c6d-626677a91dc3/volumes" Feb 28 09:36:58 crc kubenswrapper[4687]: I0228 09:36:58.709543 4687 generic.go:334] "Generic (PLEG): container finished" podID="6fb2570c-4ba8-41f6-83a3-038b8ab54177" containerID="761c9022302299ee0a45296c77d0a50f37477ba60c0397fab9d3d437bfdb0837" exitCode=0 Feb 28 09:36:58 crc kubenswrapper[4687]: I0228 09:36:58.709617 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9" event={"ID":"6fb2570c-4ba8-41f6-83a3-038b8ab54177","Type":"ContainerDied","Data":"761c9022302299ee0a45296c77d0a50f37477ba60c0397fab9d3d437bfdb0837"} Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.025620 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.165604 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6fb2570c-4ba8-41f6-83a3-038b8ab54177-libvirt-secret-0\") pod \"6fb2570c-4ba8-41f6-83a3-038b8ab54177\" (UID: \"6fb2570c-4ba8-41f6-83a3-038b8ab54177\") " Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.165758 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fb2570c-4ba8-41f6-83a3-038b8ab54177-inventory\") pod \"6fb2570c-4ba8-41f6-83a3-038b8ab54177\" (UID: \"6fb2570c-4ba8-41f6-83a3-038b8ab54177\") " Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.165789 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzjp5\" (UniqueName: \"kubernetes.io/projected/6fb2570c-4ba8-41f6-83a3-038b8ab54177-kube-api-access-zzjp5\") pod \"6fb2570c-4ba8-41f6-83a3-038b8ab54177\" (UID: \"6fb2570c-4ba8-41f6-83a3-038b8ab54177\") " Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.165822 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fb2570c-4ba8-41f6-83a3-038b8ab54177-libvirt-combined-ca-bundle\") pod \"6fb2570c-4ba8-41f6-83a3-038b8ab54177\" (UID: \"6fb2570c-4ba8-41f6-83a3-038b8ab54177\") " Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.165911 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6fb2570c-4ba8-41f6-83a3-038b8ab54177-ssh-key-openstack-edpm-ipam\") pod \"6fb2570c-4ba8-41f6-83a3-038b8ab54177\" (UID: \"6fb2570c-4ba8-41f6-83a3-038b8ab54177\") " Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.175401 4687 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fb2570c-4ba8-41f6-83a3-038b8ab54177-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "6fb2570c-4ba8-41f6-83a3-038b8ab54177" (UID: "6fb2570c-4ba8-41f6-83a3-038b8ab54177"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.187177 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fb2570c-4ba8-41f6-83a3-038b8ab54177-kube-api-access-zzjp5" (OuterVolumeSpecName: "kube-api-access-zzjp5") pod "6fb2570c-4ba8-41f6-83a3-038b8ab54177" (UID: "6fb2570c-4ba8-41f6-83a3-038b8ab54177"). InnerVolumeSpecName "kube-api-access-zzjp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.204198 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fb2570c-4ba8-41f6-83a3-038b8ab54177-inventory" (OuterVolumeSpecName: "inventory") pod "6fb2570c-4ba8-41f6-83a3-038b8ab54177" (UID: "6fb2570c-4ba8-41f6-83a3-038b8ab54177"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.222105 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fb2570c-4ba8-41f6-83a3-038b8ab54177-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "6fb2570c-4ba8-41f6-83a3-038b8ab54177" (UID: "6fb2570c-4ba8-41f6-83a3-038b8ab54177"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.235095 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fb2570c-4ba8-41f6-83a3-038b8ab54177-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6fb2570c-4ba8-41f6-83a3-038b8ab54177" (UID: "6fb2570c-4ba8-41f6-83a3-038b8ab54177"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.267567 4687 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fb2570c-4ba8-41f6-83a3-038b8ab54177-inventory\") on node \"crc\" DevicePath \"\"" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.267591 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzjp5\" (UniqueName: \"kubernetes.io/projected/6fb2570c-4ba8-41f6-83a3-038b8ab54177-kube-api-access-zzjp5\") on node \"crc\" DevicePath \"\"" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.267600 4687 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fb2570c-4ba8-41f6-83a3-038b8ab54177-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.267609 4687 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6fb2570c-4ba8-41f6-83a3-038b8ab54177-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.267619 4687 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6fb2570c-4ba8-41f6-83a3-038b8ab54177-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.724491 4687 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9" event={"ID":"6fb2570c-4ba8-41f6-83a3-038b8ab54177","Type":"ContainerDied","Data":"c13b7151c33e81c4122aa4e1f6779cb68b675b630d9eb22db9b3529915667deb"} Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.724530 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c13b7151c33e81c4122aa4e1f6779cb68b675b630d9eb22db9b3529915667deb" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.724533 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.803147 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x"] Feb 28 09:37:00 crc kubenswrapper[4687]: E0228 09:37:00.803453 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c7d2dc-b5e8-454c-92ea-57f7d2465681" containerName="extract-content" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.803479 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c7d2dc-b5e8-454c-92ea-57f7d2465681" containerName="extract-content" Feb 28 09:37:00 crc kubenswrapper[4687]: E0228 09:37:00.803490 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c7d2dc-b5e8-454c-92ea-57f7d2465681" containerName="registry-server" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.803497 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c7d2dc-b5e8-454c-92ea-57f7d2465681" containerName="registry-server" Feb 28 09:37:00 crc kubenswrapper[4687]: E0228 09:37:00.803506 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c06f8d8a-def7-4ddd-9c6d-626677a91dc3" containerName="registry-server" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.803512 4687 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c06f8d8a-def7-4ddd-9c6d-626677a91dc3" containerName="registry-server" Feb 28 09:37:00 crc kubenswrapper[4687]: E0228 09:37:00.803524 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fb2570c-4ba8-41f6-83a3-038b8ab54177" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.803530 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fb2570c-4ba8-41f6-83a3-038b8ab54177" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 28 09:37:00 crc kubenswrapper[4687]: E0228 09:37:00.803542 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c06f8d8a-def7-4ddd-9c6d-626677a91dc3" containerName="extract-content" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.803547 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="c06f8d8a-def7-4ddd-9c6d-626677a91dc3" containerName="extract-content" Feb 28 09:37:00 crc kubenswrapper[4687]: E0228 09:37:00.803560 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c06f8d8a-def7-4ddd-9c6d-626677a91dc3" containerName="extract-utilities" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.803566 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="c06f8d8a-def7-4ddd-9c6d-626677a91dc3" containerName="extract-utilities" Feb 28 09:37:00 crc kubenswrapper[4687]: E0228 09:37:00.803587 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c7d2dc-b5e8-454c-92ea-57f7d2465681" containerName="extract-utilities" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.803593 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c7d2dc-b5e8-454c-92ea-57f7d2465681" containerName="extract-utilities" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.803727 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fb2570c-4ba8-41f6-83a3-038b8ab54177" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 28 09:37:00 crc 
kubenswrapper[4687]: I0228 09:37:00.803741 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="66c7d2dc-b5e8-454c-92ea-57f7d2465681" containerName="registry-server" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.803755 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="c06f8d8a-def7-4ddd-9c6d-626677a91dc3" containerName="registry-server" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.804261 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.805432 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.805769 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ffgb4" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.805985 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.806182 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.807151 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.814481 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x"] Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.814564 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.814776 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 28 
09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.875357 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.875544 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svt52\" (UniqueName: \"kubernetes.io/projected/b0b65af5-abae-4587-abda-dfda34ed0d0b-kube-api-access-svt52\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.875643 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.875725 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.875829 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.875917 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.876003 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.876133 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.876206 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-cell1-compute-config-3\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.876352 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.876393 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.977910 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.977958 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc 
kubenswrapper[4687]: I0228 09:37:00.977984 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.978012 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.978105 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.978123 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.978165 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.978184 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.978203 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.978260 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svt52\" (UniqueName: \"kubernetes.io/projected/b0b65af5-abae-4587-abda-dfda34ed0d0b-kube-api-access-svt52\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.978278 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 
09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.979249 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.981383 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.981408 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.981639 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.981785 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.982103 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.982394 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.983136 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.983627 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc 
kubenswrapper[4687]: I0228 09:37:00.983762 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:00 crc kubenswrapper[4687]: I0228 09:37:00.992240 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svt52\" (UniqueName: \"kubernetes.io/projected/b0b65af5-abae-4587-abda-dfda34ed0d0b-kube-api-access-svt52\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dv48x\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:01 crc kubenswrapper[4687]: I0228 09:37:01.122614 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:37:01 crc kubenswrapper[4687]: I0228 09:37:01.563261 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x"] Feb 28 09:37:01 crc kubenswrapper[4687]: I0228 09:37:01.735918 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" event={"ID":"b0b65af5-abae-4587-abda-dfda34ed0d0b","Type":"ContainerStarted","Data":"5da2b80d6ab52e5053a0c3a86f1014ab25e9ae02e633cde3f460ce832bd1b194"} Feb 28 09:37:02 crc kubenswrapper[4687]: I0228 09:37:02.742845 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" event={"ID":"b0b65af5-abae-4587-abda-dfda34ed0d0b","Type":"ContainerStarted","Data":"750bf26a7386f185a01c7066b0e39cbe27c17f1ddbecebe34f4d64cda95fa7ae"} Feb 28 09:37:02 crc kubenswrapper[4687]: I0228 09:37:02.776281 4687 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" podStartSLOduration=2.32676874 podStartE2EDuration="2.77626614s" podCreationTimestamp="2026-02-28 09:37:00 +0000 UTC" firstStartedPulling="2026-02-28 09:37:01.565464182 +0000 UTC m=+2013.256033520" lastFinishedPulling="2026-02-28 09:37:02.014961583 +0000 UTC m=+2013.705530920" observedRunningTime="2026-02-28 09:37:02.760357997 +0000 UTC m=+2014.450927353" watchObservedRunningTime="2026-02-28 09:37:02.77626614 +0000 UTC m=+2014.466835477" Feb 28 09:37:25 crc kubenswrapper[4687]: I0228 09:37:25.002444 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:37:25 crc kubenswrapper[4687]: I0228 09:37:25.003003 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:37:25 crc kubenswrapper[4687]: I0228 09:37:25.003069 4687 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" Feb 28 09:37:25 crc kubenswrapper[4687]: I0228 09:37:25.003531 4687 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dae4760c42bdf35ff81f24568deadc7a5d5f1d56cf50f222534d7b17be296984"} pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 28 09:37:25 crc kubenswrapper[4687]: I0228 09:37:25.003583 4687 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" containerID="cri-o://dae4760c42bdf35ff81f24568deadc7a5d5f1d56cf50f222534d7b17be296984" gracePeriod=600 Feb 28 09:37:25 crc kubenswrapper[4687]: I0228 09:37:25.909490 4687 generic.go:334] "Generic (PLEG): container finished" podID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerID="dae4760c42bdf35ff81f24568deadc7a5d5f1d56cf50f222534d7b17be296984" exitCode=0 Feb 28 09:37:25 crc kubenswrapper[4687]: I0228 09:37:25.909562 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerDied","Data":"dae4760c42bdf35ff81f24568deadc7a5d5f1d56cf50f222534d7b17be296984"} Feb 28 09:37:25 crc kubenswrapper[4687]: I0228 09:37:25.910277 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerStarted","Data":"483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9"} Feb 28 09:37:25 crc kubenswrapper[4687]: I0228 09:37:25.910300 4687 scope.go:117] "RemoveContainer" containerID="3553b6238c39af6623c9b43e30d6d879f25a9c6400ada40d42773d6c033a446f" Feb 28 09:38:00 crc kubenswrapper[4687]: I0228 09:38:00.131614 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537858-prsjl"] Feb 28 09:38:00 crc kubenswrapper[4687]: I0228 09:38:00.133418 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537858-prsjl" Feb 28 09:38:00 crc kubenswrapper[4687]: I0228 09:38:00.135831 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:38:00 crc kubenswrapper[4687]: I0228 09:38:00.136373 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:38:00 crc kubenswrapper[4687]: I0228 09:38:00.136590 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:38:00 crc kubenswrapper[4687]: I0228 09:38:00.146660 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537858-prsjl"] Feb 28 09:38:00 crc kubenswrapper[4687]: I0228 09:38:00.217637 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltr2j\" (UniqueName: \"kubernetes.io/projected/e761c01a-dc8b-4439-8539-e65e64d6c8bb-kube-api-access-ltr2j\") pod \"auto-csr-approver-29537858-prsjl\" (UID: \"e761c01a-dc8b-4439-8539-e65e64d6c8bb\") " pod="openshift-infra/auto-csr-approver-29537858-prsjl" Feb 28 09:38:00 crc kubenswrapper[4687]: I0228 09:38:00.319177 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltr2j\" (UniqueName: \"kubernetes.io/projected/e761c01a-dc8b-4439-8539-e65e64d6c8bb-kube-api-access-ltr2j\") pod \"auto-csr-approver-29537858-prsjl\" (UID: \"e761c01a-dc8b-4439-8539-e65e64d6c8bb\") " pod="openshift-infra/auto-csr-approver-29537858-prsjl" Feb 28 09:38:00 crc kubenswrapper[4687]: I0228 09:38:00.334914 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltr2j\" (UniqueName: \"kubernetes.io/projected/e761c01a-dc8b-4439-8539-e65e64d6c8bb-kube-api-access-ltr2j\") pod \"auto-csr-approver-29537858-prsjl\" (UID: \"e761c01a-dc8b-4439-8539-e65e64d6c8bb\") " 
pod="openshift-infra/auto-csr-approver-29537858-prsjl" Feb 28 09:38:00 crc kubenswrapper[4687]: I0228 09:38:00.465724 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537858-prsjl" Feb 28 09:38:00 crc kubenswrapper[4687]: I0228 09:38:00.831937 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537858-prsjl"] Feb 28 09:38:00 crc kubenswrapper[4687]: I0228 09:38:00.837398 4687 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 28 09:38:01 crc kubenswrapper[4687]: I0228 09:38:01.144833 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537858-prsjl" event={"ID":"e761c01a-dc8b-4439-8539-e65e64d6c8bb","Type":"ContainerStarted","Data":"9bd12e0d0c685f58395c7d2ae8a777d18e83180cf9f6f1454ef8887a9b71b5d7"} Feb 28 09:38:02 crc kubenswrapper[4687]: I0228 09:38:02.152573 4687 generic.go:334] "Generic (PLEG): container finished" podID="e761c01a-dc8b-4439-8539-e65e64d6c8bb" containerID="3142e563c901316ed2aea43a72ef45305de960166165e018b8112e00fae7adc9" exitCode=0 Feb 28 09:38:02 crc kubenswrapper[4687]: I0228 09:38:02.152672 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537858-prsjl" event={"ID":"e761c01a-dc8b-4439-8539-e65e64d6c8bb","Type":"ContainerDied","Data":"3142e563c901316ed2aea43a72ef45305de960166165e018b8112e00fae7adc9"} Feb 28 09:38:03 crc kubenswrapper[4687]: I0228 09:38:03.418423 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537858-prsjl" Feb 28 09:38:03 crc kubenswrapper[4687]: I0228 09:38:03.464076 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltr2j\" (UniqueName: \"kubernetes.io/projected/e761c01a-dc8b-4439-8539-e65e64d6c8bb-kube-api-access-ltr2j\") pod \"e761c01a-dc8b-4439-8539-e65e64d6c8bb\" (UID: \"e761c01a-dc8b-4439-8539-e65e64d6c8bb\") " Feb 28 09:38:03 crc kubenswrapper[4687]: I0228 09:38:03.474404 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e761c01a-dc8b-4439-8539-e65e64d6c8bb-kube-api-access-ltr2j" (OuterVolumeSpecName: "kube-api-access-ltr2j") pod "e761c01a-dc8b-4439-8539-e65e64d6c8bb" (UID: "e761c01a-dc8b-4439-8539-e65e64d6c8bb"). InnerVolumeSpecName "kube-api-access-ltr2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:38:03 crc kubenswrapper[4687]: I0228 09:38:03.567831 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltr2j\" (UniqueName: \"kubernetes.io/projected/e761c01a-dc8b-4439-8539-e65e64d6c8bb-kube-api-access-ltr2j\") on node \"crc\" DevicePath \"\"" Feb 28 09:38:04 crc kubenswrapper[4687]: I0228 09:38:04.165654 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537858-prsjl" event={"ID":"e761c01a-dc8b-4439-8539-e65e64d6c8bb","Type":"ContainerDied","Data":"9bd12e0d0c685f58395c7d2ae8a777d18e83180cf9f6f1454ef8887a9b71b5d7"} Feb 28 09:38:04 crc kubenswrapper[4687]: I0228 09:38:04.165690 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537858-prsjl" Feb 28 09:38:04 crc kubenswrapper[4687]: I0228 09:38:04.165693 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bd12e0d0c685f58395c7d2ae8a777d18e83180cf9f6f1454ef8887a9b71b5d7" Feb 28 09:38:04 crc kubenswrapper[4687]: I0228 09:38:04.470591 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537852-2zdp6"] Feb 28 09:38:04 crc kubenswrapper[4687]: I0228 09:38:04.475883 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537852-2zdp6"] Feb 28 09:38:04 crc kubenswrapper[4687]: I0228 09:38:04.664571 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="958ee491-1300-475e-9410-521ebb3f5078" path="/var/lib/kubelet/pods/958ee491-1300-475e-9410-521ebb3f5078/volumes" Feb 28 09:38:21 crc kubenswrapper[4687]: I0228 09:38:21.120930 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lq5xm"] Feb 28 09:38:21 crc kubenswrapper[4687]: E0228 09:38:21.121811 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e761c01a-dc8b-4439-8539-e65e64d6c8bb" containerName="oc" Feb 28 09:38:21 crc kubenswrapper[4687]: I0228 09:38:21.121823 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="e761c01a-dc8b-4439-8539-e65e64d6c8bb" containerName="oc" Feb 28 09:38:21 crc kubenswrapper[4687]: I0228 09:38:21.122002 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="e761c01a-dc8b-4439-8539-e65e64d6c8bb" containerName="oc" Feb 28 09:38:21 crc kubenswrapper[4687]: I0228 09:38:21.123184 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lq5xm" Feb 28 09:38:21 crc kubenswrapper[4687]: I0228 09:38:21.133279 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lq5xm"] Feb 28 09:38:21 crc kubenswrapper[4687]: I0228 09:38:21.208244 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grbc4\" (UniqueName: \"kubernetes.io/projected/3a80852c-64b9-446a-a5ee-4fc2310301ce-kube-api-access-grbc4\") pod \"community-operators-lq5xm\" (UID: \"3a80852c-64b9-446a-a5ee-4fc2310301ce\") " pod="openshift-marketplace/community-operators-lq5xm" Feb 28 09:38:21 crc kubenswrapper[4687]: I0228 09:38:21.208292 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a80852c-64b9-446a-a5ee-4fc2310301ce-utilities\") pod \"community-operators-lq5xm\" (UID: \"3a80852c-64b9-446a-a5ee-4fc2310301ce\") " pod="openshift-marketplace/community-operators-lq5xm" Feb 28 09:38:21 crc kubenswrapper[4687]: I0228 09:38:21.208333 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a80852c-64b9-446a-a5ee-4fc2310301ce-catalog-content\") pod \"community-operators-lq5xm\" (UID: \"3a80852c-64b9-446a-a5ee-4fc2310301ce\") " pod="openshift-marketplace/community-operators-lq5xm" Feb 28 09:38:21 crc kubenswrapper[4687]: I0228 09:38:21.310165 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grbc4\" (UniqueName: \"kubernetes.io/projected/3a80852c-64b9-446a-a5ee-4fc2310301ce-kube-api-access-grbc4\") pod \"community-operators-lq5xm\" (UID: \"3a80852c-64b9-446a-a5ee-4fc2310301ce\") " pod="openshift-marketplace/community-operators-lq5xm" Feb 28 09:38:21 crc kubenswrapper[4687]: I0228 09:38:21.310204 4687 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a80852c-64b9-446a-a5ee-4fc2310301ce-utilities\") pod \"community-operators-lq5xm\" (UID: \"3a80852c-64b9-446a-a5ee-4fc2310301ce\") " pod="openshift-marketplace/community-operators-lq5xm" Feb 28 09:38:21 crc kubenswrapper[4687]: I0228 09:38:21.310238 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a80852c-64b9-446a-a5ee-4fc2310301ce-catalog-content\") pod \"community-operators-lq5xm\" (UID: \"3a80852c-64b9-446a-a5ee-4fc2310301ce\") " pod="openshift-marketplace/community-operators-lq5xm" Feb 28 09:38:21 crc kubenswrapper[4687]: I0228 09:38:21.310776 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a80852c-64b9-446a-a5ee-4fc2310301ce-catalog-content\") pod \"community-operators-lq5xm\" (UID: \"3a80852c-64b9-446a-a5ee-4fc2310301ce\") " pod="openshift-marketplace/community-operators-lq5xm" Feb 28 09:38:21 crc kubenswrapper[4687]: I0228 09:38:21.310998 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a80852c-64b9-446a-a5ee-4fc2310301ce-utilities\") pod \"community-operators-lq5xm\" (UID: \"3a80852c-64b9-446a-a5ee-4fc2310301ce\") " pod="openshift-marketplace/community-operators-lq5xm" Feb 28 09:38:21 crc kubenswrapper[4687]: I0228 09:38:21.328917 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grbc4\" (UniqueName: \"kubernetes.io/projected/3a80852c-64b9-446a-a5ee-4fc2310301ce-kube-api-access-grbc4\") pod \"community-operators-lq5xm\" (UID: \"3a80852c-64b9-446a-a5ee-4fc2310301ce\") " pod="openshift-marketplace/community-operators-lq5xm" Feb 28 09:38:21 crc kubenswrapper[4687]: I0228 09:38:21.444545 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lq5xm" Feb 28 09:38:21 crc kubenswrapper[4687]: I0228 09:38:21.888000 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lq5xm"] Feb 28 09:38:22 crc kubenswrapper[4687]: I0228 09:38:22.301753 4687 generic.go:334] "Generic (PLEG): container finished" podID="3a80852c-64b9-446a-a5ee-4fc2310301ce" containerID="cd00bd5aad063e2370175a1d0f6f0e896833da930146a84187839db0357547fa" exitCode=0 Feb 28 09:38:22 crc kubenswrapper[4687]: I0228 09:38:22.301792 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lq5xm" event={"ID":"3a80852c-64b9-446a-a5ee-4fc2310301ce","Type":"ContainerDied","Data":"cd00bd5aad063e2370175a1d0f6f0e896833da930146a84187839db0357547fa"} Feb 28 09:38:22 crc kubenswrapper[4687]: I0228 09:38:22.302104 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lq5xm" event={"ID":"3a80852c-64b9-446a-a5ee-4fc2310301ce","Type":"ContainerStarted","Data":"c9666c9e1129c1d69abd3e27999eca1a1bd036aaf79cd0ab653d8ecebefd0aad"} Feb 28 09:38:23 crc kubenswrapper[4687]: I0228 09:38:23.313849 4687 generic.go:334] "Generic (PLEG): container finished" podID="3a80852c-64b9-446a-a5ee-4fc2310301ce" containerID="50c3dc6ae42f9e0dc132934fb79e5c5fbc81e897680b10c2c4e6c614fdbfa119" exitCode=0 Feb 28 09:38:23 crc kubenswrapper[4687]: I0228 09:38:23.313928 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lq5xm" event={"ID":"3a80852c-64b9-446a-a5ee-4fc2310301ce","Type":"ContainerDied","Data":"50c3dc6ae42f9e0dc132934fb79e5c5fbc81e897680b10c2c4e6c614fdbfa119"} Feb 28 09:38:24 crc kubenswrapper[4687]: I0228 09:38:24.322834 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lq5xm" 
event={"ID":"3a80852c-64b9-446a-a5ee-4fc2310301ce","Type":"ContainerStarted","Data":"6f7407bfc5ffe5c1bd9dd847d8716a74bbd40e1ad0262263836f0596e9a806f5"} Feb 28 09:38:24 crc kubenswrapper[4687]: I0228 09:38:24.345737 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lq5xm" podStartSLOduration=1.896326854 podStartE2EDuration="3.345721917s" podCreationTimestamp="2026-02-28 09:38:21 +0000 UTC" firstStartedPulling="2026-02-28 09:38:22.303164542 +0000 UTC m=+2093.993733878" lastFinishedPulling="2026-02-28 09:38:23.752559604 +0000 UTC m=+2095.443128941" observedRunningTime="2026-02-28 09:38:24.343914209 +0000 UTC m=+2096.034483547" watchObservedRunningTime="2026-02-28 09:38:24.345721917 +0000 UTC m=+2096.036291254" Feb 28 09:38:31 crc kubenswrapper[4687]: I0228 09:38:31.445185 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lq5xm" Feb 28 09:38:31 crc kubenswrapper[4687]: I0228 09:38:31.446202 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lq5xm" Feb 28 09:38:31 crc kubenswrapper[4687]: I0228 09:38:31.487438 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lq5xm" Feb 28 09:38:32 crc kubenswrapper[4687]: I0228 09:38:32.428563 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lq5xm" Feb 28 09:38:32 crc kubenswrapper[4687]: I0228 09:38:32.476249 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lq5xm"] Feb 28 09:38:34 crc kubenswrapper[4687]: I0228 09:38:34.402069 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lq5xm" podUID="3a80852c-64b9-446a-a5ee-4fc2310301ce" containerName="registry-server" 
containerID="cri-o://6f7407bfc5ffe5c1bd9dd847d8716a74bbd40e1ad0262263836f0596e9a806f5" gracePeriod=2 Feb 28 09:38:34 crc kubenswrapper[4687]: I0228 09:38:34.795624 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lq5xm" Feb 28 09:38:34 crc kubenswrapper[4687]: I0228 09:38:34.890420 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a80852c-64b9-446a-a5ee-4fc2310301ce-utilities\") pod \"3a80852c-64b9-446a-a5ee-4fc2310301ce\" (UID: \"3a80852c-64b9-446a-a5ee-4fc2310301ce\") " Feb 28 09:38:34 crc kubenswrapper[4687]: I0228 09:38:34.890487 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a80852c-64b9-446a-a5ee-4fc2310301ce-catalog-content\") pod \"3a80852c-64b9-446a-a5ee-4fc2310301ce\" (UID: \"3a80852c-64b9-446a-a5ee-4fc2310301ce\") " Feb 28 09:38:34 crc kubenswrapper[4687]: I0228 09:38:34.890540 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grbc4\" (UniqueName: \"kubernetes.io/projected/3a80852c-64b9-446a-a5ee-4fc2310301ce-kube-api-access-grbc4\") pod \"3a80852c-64b9-446a-a5ee-4fc2310301ce\" (UID: \"3a80852c-64b9-446a-a5ee-4fc2310301ce\") " Feb 28 09:38:34 crc kubenswrapper[4687]: I0228 09:38:34.890991 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a80852c-64b9-446a-a5ee-4fc2310301ce-utilities" (OuterVolumeSpecName: "utilities") pod "3a80852c-64b9-446a-a5ee-4fc2310301ce" (UID: "3a80852c-64b9-446a-a5ee-4fc2310301ce"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:38:34 crc kubenswrapper[4687]: I0228 09:38:34.891187 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a80852c-64b9-446a-a5ee-4fc2310301ce-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:38:34 crc kubenswrapper[4687]: I0228 09:38:34.896604 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a80852c-64b9-446a-a5ee-4fc2310301ce-kube-api-access-grbc4" (OuterVolumeSpecName: "kube-api-access-grbc4") pod "3a80852c-64b9-446a-a5ee-4fc2310301ce" (UID: "3a80852c-64b9-446a-a5ee-4fc2310301ce"). InnerVolumeSpecName "kube-api-access-grbc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:38:34 crc kubenswrapper[4687]: I0228 09:38:34.930965 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a80852c-64b9-446a-a5ee-4fc2310301ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a80852c-64b9-446a-a5ee-4fc2310301ce" (UID: "3a80852c-64b9-446a-a5ee-4fc2310301ce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:38:34 crc kubenswrapper[4687]: I0228 09:38:34.994118 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a80852c-64b9-446a-a5ee-4fc2310301ce-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:38:34 crc kubenswrapper[4687]: I0228 09:38:34.994157 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grbc4\" (UniqueName: \"kubernetes.io/projected/3a80852c-64b9-446a-a5ee-4fc2310301ce-kube-api-access-grbc4\") on node \"crc\" DevicePath \"\"" Feb 28 09:38:35 crc kubenswrapper[4687]: I0228 09:38:35.424770 4687 generic.go:334] "Generic (PLEG): container finished" podID="3a80852c-64b9-446a-a5ee-4fc2310301ce" containerID="6f7407bfc5ffe5c1bd9dd847d8716a74bbd40e1ad0262263836f0596e9a806f5" exitCode=0 Feb 28 09:38:35 crc kubenswrapper[4687]: I0228 09:38:35.424848 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lq5xm" event={"ID":"3a80852c-64b9-446a-a5ee-4fc2310301ce","Type":"ContainerDied","Data":"6f7407bfc5ffe5c1bd9dd847d8716a74bbd40e1ad0262263836f0596e9a806f5"} Feb 28 09:38:35 crc kubenswrapper[4687]: I0228 09:38:35.424870 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lq5xm" Feb 28 09:38:35 crc kubenswrapper[4687]: I0228 09:38:35.424890 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lq5xm" event={"ID":"3a80852c-64b9-446a-a5ee-4fc2310301ce","Type":"ContainerDied","Data":"c9666c9e1129c1d69abd3e27999eca1a1bd036aaf79cd0ab653d8ecebefd0aad"} Feb 28 09:38:35 crc kubenswrapper[4687]: I0228 09:38:35.424911 4687 scope.go:117] "RemoveContainer" containerID="6f7407bfc5ffe5c1bd9dd847d8716a74bbd40e1ad0262263836f0596e9a806f5" Feb 28 09:38:35 crc kubenswrapper[4687]: I0228 09:38:35.453984 4687 scope.go:117] "RemoveContainer" containerID="50c3dc6ae42f9e0dc132934fb79e5c5fbc81e897680b10c2c4e6c614fdbfa119" Feb 28 09:38:35 crc kubenswrapper[4687]: I0228 09:38:35.458425 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lq5xm"] Feb 28 09:38:35 crc kubenswrapper[4687]: I0228 09:38:35.466322 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lq5xm"] Feb 28 09:38:35 crc kubenswrapper[4687]: I0228 09:38:35.479529 4687 scope.go:117] "RemoveContainer" containerID="cd00bd5aad063e2370175a1d0f6f0e896833da930146a84187839db0357547fa" Feb 28 09:38:35 crc kubenswrapper[4687]: I0228 09:38:35.504175 4687 scope.go:117] "RemoveContainer" containerID="6f7407bfc5ffe5c1bd9dd847d8716a74bbd40e1ad0262263836f0596e9a806f5" Feb 28 09:38:35 crc kubenswrapper[4687]: E0228 09:38:35.504591 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f7407bfc5ffe5c1bd9dd847d8716a74bbd40e1ad0262263836f0596e9a806f5\": container with ID starting with 6f7407bfc5ffe5c1bd9dd847d8716a74bbd40e1ad0262263836f0596e9a806f5 not found: ID does not exist" containerID="6f7407bfc5ffe5c1bd9dd847d8716a74bbd40e1ad0262263836f0596e9a806f5" Feb 28 09:38:35 crc kubenswrapper[4687]: I0228 09:38:35.504662 4687 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f7407bfc5ffe5c1bd9dd847d8716a74bbd40e1ad0262263836f0596e9a806f5"} err="failed to get container status \"6f7407bfc5ffe5c1bd9dd847d8716a74bbd40e1ad0262263836f0596e9a806f5\": rpc error: code = NotFound desc = could not find container \"6f7407bfc5ffe5c1bd9dd847d8716a74bbd40e1ad0262263836f0596e9a806f5\": container with ID starting with 6f7407bfc5ffe5c1bd9dd847d8716a74bbd40e1ad0262263836f0596e9a806f5 not found: ID does not exist" Feb 28 09:38:35 crc kubenswrapper[4687]: I0228 09:38:35.504690 4687 scope.go:117] "RemoveContainer" containerID="50c3dc6ae42f9e0dc132934fb79e5c5fbc81e897680b10c2c4e6c614fdbfa119" Feb 28 09:38:35 crc kubenswrapper[4687]: E0228 09:38:35.505058 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50c3dc6ae42f9e0dc132934fb79e5c5fbc81e897680b10c2c4e6c614fdbfa119\": container with ID starting with 50c3dc6ae42f9e0dc132934fb79e5c5fbc81e897680b10c2c4e6c614fdbfa119 not found: ID does not exist" containerID="50c3dc6ae42f9e0dc132934fb79e5c5fbc81e897680b10c2c4e6c614fdbfa119" Feb 28 09:38:35 crc kubenswrapper[4687]: I0228 09:38:35.505084 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50c3dc6ae42f9e0dc132934fb79e5c5fbc81e897680b10c2c4e6c614fdbfa119"} err="failed to get container status \"50c3dc6ae42f9e0dc132934fb79e5c5fbc81e897680b10c2c4e6c614fdbfa119\": rpc error: code = NotFound desc = could not find container \"50c3dc6ae42f9e0dc132934fb79e5c5fbc81e897680b10c2c4e6c614fdbfa119\": container with ID starting with 50c3dc6ae42f9e0dc132934fb79e5c5fbc81e897680b10c2c4e6c614fdbfa119 not found: ID does not exist" Feb 28 09:38:35 crc kubenswrapper[4687]: I0228 09:38:35.505102 4687 scope.go:117] "RemoveContainer" containerID="cd00bd5aad063e2370175a1d0f6f0e896833da930146a84187839db0357547fa" Feb 28 09:38:35 crc kubenswrapper[4687]: E0228 
09:38:35.505413 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd00bd5aad063e2370175a1d0f6f0e896833da930146a84187839db0357547fa\": container with ID starting with cd00bd5aad063e2370175a1d0f6f0e896833da930146a84187839db0357547fa not found: ID does not exist" containerID="cd00bd5aad063e2370175a1d0f6f0e896833da930146a84187839db0357547fa" Feb 28 09:38:35 crc kubenswrapper[4687]: I0228 09:38:35.505438 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd00bd5aad063e2370175a1d0f6f0e896833da930146a84187839db0357547fa"} err="failed to get container status \"cd00bd5aad063e2370175a1d0f6f0e896833da930146a84187839db0357547fa\": rpc error: code = NotFound desc = could not find container \"cd00bd5aad063e2370175a1d0f6f0e896833da930146a84187839db0357547fa\": container with ID starting with cd00bd5aad063e2370175a1d0f6f0e896833da930146a84187839db0357547fa not found: ID does not exist" Feb 28 09:38:36 crc kubenswrapper[4687]: I0228 09:38:36.664073 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a80852c-64b9-446a-a5ee-4fc2310301ce" path="/var/lib/kubelet/pods/3a80852c-64b9-446a-a5ee-4fc2310301ce/volumes" Feb 28 09:38:43 crc kubenswrapper[4687]: I0228 09:38:43.250139 4687 scope.go:117] "RemoveContainer" containerID="0cf4c155a01ed143e132c679c863a135ae9ab88907072c7272392f9baebda177" Feb 28 09:38:46 crc kubenswrapper[4687]: I0228 09:38:46.500561 4687 generic.go:334] "Generic (PLEG): container finished" podID="b0b65af5-abae-4587-abda-dfda34ed0d0b" containerID="750bf26a7386f185a01c7066b0e39cbe27c17f1ddbecebe34f4d64cda95fa7ae" exitCode=0 Feb 28 09:38:46 crc kubenswrapper[4687]: I0228 09:38:46.500667 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" 
event={"ID":"b0b65af5-abae-4587-abda-dfda34ed0d0b","Type":"ContainerDied","Data":"750bf26a7386f185a01c7066b0e39cbe27c17f1ddbecebe34f4d64cda95fa7ae"} Feb 28 09:38:47 crc kubenswrapper[4687]: I0228 09:38:47.901051 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:38:47 crc kubenswrapper[4687]: I0228 09:38:47.931480 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-inventory\") pod \"b0b65af5-abae-4587-abda-dfda34ed0d0b\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " Feb 28 09:38:47 crc kubenswrapper[4687]: I0228 09:38:47.934133 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-cell1-compute-config-2\") pod \"b0b65af5-abae-4587-abda-dfda34ed0d0b\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " Feb 28 09:38:47 crc kubenswrapper[4687]: I0228 09:38:47.934171 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-combined-ca-bundle\") pod \"b0b65af5-abae-4587-abda-dfda34ed0d0b\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " Feb 28 09:38:47 crc kubenswrapper[4687]: I0228 09:38:47.934203 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-extra-config-0\") pod \"b0b65af5-abae-4587-abda-dfda34ed0d0b\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " Feb 28 09:38:47 crc kubenswrapper[4687]: I0228 09:38:47.934236 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-ssh-key-openstack-edpm-ipam\") pod \"b0b65af5-abae-4587-abda-dfda34ed0d0b\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " Feb 28 09:38:47 crc kubenswrapper[4687]: I0228 09:38:47.934277 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svt52\" (UniqueName: \"kubernetes.io/projected/b0b65af5-abae-4587-abda-dfda34ed0d0b-kube-api-access-svt52\") pod \"b0b65af5-abae-4587-abda-dfda34ed0d0b\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " Feb 28 09:38:47 crc kubenswrapper[4687]: I0228 09:38:47.934312 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-cell1-compute-config-0\") pod \"b0b65af5-abae-4587-abda-dfda34ed0d0b\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " Feb 28 09:38:47 crc kubenswrapper[4687]: I0228 09:38:47.934460 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-cell1-compute-config-1\") pod \"b0b65af5-abae-4587-abda-dfda34ed0d0b\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " Feb 28 09:38:47 crc kubenswrapper[4687]: I0228 09:38:47.934500 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-cell1-compute-config-3\") pod \"b0b65af5-abae-4587-abda-dfda34ed0d0b\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " Feb 28 09:38:47 crc kubenswrapper[4687]: I0228 09:38:47.934603 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-migration-ssh-key-1\") pod 
\"b0b65af5-abae-4587-abda-dfda34ed0d0b\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " Feb 28 09:38:47 crc kubenswrapper[4687]: I0228 09:38:47.934655 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-migration-ssh-key-0\") pod \"b0b65af5-abae-4587-abda-dfda34ed0d0b\" (UID: \"b0b65af5-abae-4587-abda-dfda34ed0d0b\") " Feb 28 09:38:47 crc kubenswrapper[4687]: I0228 09:38:47.957100 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "b0b65af5-abae-4587-abda-dfda34ed0d0b" (UID: "b0b65af5-abae-4587-abda-dfda34ed0d0b"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:38:47 crc kubenswrapper[4687]: I0228 09:38:47.957866 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0b65af5-abae-4587-abda-dfda34ed0d0b-kube-api-access-svt52" (OuterVolumeSpecName: "kube-api-access-svt52") pod "b0b65af5-abae-4587-abda-dfda34ed0d0b" (UID: "b0b65af5-abae-4587-abda-dfda34ed0d0b"). InnerVolumeSpecName "kube-api-access-svt52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:38:47 crc kubenswrapper[4687]: I0228 09:38:47.960840 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "b0b65af5-abae-4587-abda-dfda34ed0d0b" (UID: "b0b65af5-abae-4587-abda-dfda34ed0d0b"). InnerVolumeSpecName "nova-cell1-compute-config-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:38:47 crc kubenswrapper[4687]: I0228 09:38:47.961188 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "b0b65af5-abae-4587-abda-dfda34ed0d0b" (UID: "b0b65af5-abae-4587-abda-dfda34ed0d0b"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:38:47 crc kubenswrapper[4687]: I0228 09:38:47.963105 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "b0b65af5-abae-4587-abda-dfda34ed0d0b" (UID: "b0b65af5-abae-4587-abda-dfda34ed0d0b"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:38:47 crc kubenswrapper[4687]: I0228 09:38:47.967308 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-inventory" (OuterVolumeSpecName: "inventory") pod "b0b65af5-abae-4587-abda-dfda34ed0d0b" (UID: "b0b65af5-abae-4587-abda-dfda34ed0d0b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:38:47 crc kubenswrapper[4687]: I0228 09:38:47.974287 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b0b65af5-abae-4587-abda-dfda34ed0d0b" (UID: "b0b65af5-abae-4587-abda-dfda34ed0d0b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:38:47 crc kubenswrapper[4687]: I0228 09:38:47.976254 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "b0b65af5-abae-4587-abda-dfda34ed0d0b" (UID: "b0b65af5-abae-4587-abda-dfda34ed0d0b"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:38:47 crc kubenswrapper[4687]: I0228 09:38:47.977309 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "b0b65af5-abae-4587-abda-dfda34ed0d0b" (UID: "b0b65af5-abae-4587-abda-dfda34ed0d0b"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:38:47 crc kubenswrapper[4687]: I0228 09:38:47.979180 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "b0b65af5-abae-4587-abda-dfda34ed0d0b" (UID: "b0b65af5-abae-4587-abda-dfda34ed0d0b"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:38:47 crc kubenswrapper[4687]: I0228 09:38:47.986408 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "b0b65af5-abae-4587-abda-dfda34ed0d0b" (UID: "b0b65af5-abae-4587-abda-dfda34ed0d0b"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.037514 4687 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-inventory\") on node \"crc\" DevicePath \"\"" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.037547 4687 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.037561 4687 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.037573 4687 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.037581 4687 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.037589 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svt52\" (UniqueName: \"kubernetes.io/projected/b0b65af5-abae-4587-abda-dfda34ed0d0b-kube-api-access-svt52\") on node \"crc\" DevicePath \"\"" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.037597 4687 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-cell1-compute-config-0\") on node 
\"crc\" DevicePath \"\"" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.037604 4687 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.037618 4687 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.037626 4687 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.037633 4687 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b0b65af5-abae-4587-abda-dfda34ed0d0b-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.518924 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" event={"ID":"b0b65af5-abae-4587-abda-dfda34ed0d0b","Type":"ContainerDied","Data":"5da2b80d6ab52e5053a0c3a86f1014ab25e9ae02e633cde3f460ce832bd1b194"} Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.518972 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5da2b80d6ab52e5053a0c3a86f1014ab25e9ae02e633cde3f460ce832bd1b194" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.519249 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dv48x" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.596909 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc"] Feb 28 09:38:48 crc kubenswrapper[4687]: E0228 09:38:48.597547 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a80852c-64b9-446a-a5ee-4fc2310301ce" containerName="extract-content" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.597635 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a80852c-64b9-446a-a5ee-4fc2310301ce" containerName="extract-content" Feb 28 09:38:48 crc kubenswrapper[4687]: E0228 09:38:48.597699 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a80852c-64b9-446a-a5ee-4fc2310301ce" containerName="extract-utilities" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.597760 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a80852c-64b9-446a-a5ee-4fc2310301ce" containerName="extract-utilities" Feb 28 09:38:48 crc kubenswrapper[4687]: E0228 09:38:48.597820 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a80852c-64b9-446a-a5ee-4fc2310301ce" containerName="registry-server" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.597872 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a80852c-64b9-446a-a5ee-4fc2310301ce" containerName="registry-server" Feb 28 09:38:48 crc kubenswrapper[4687]: E0228 09:38:48.597928 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b65af5-abae-4587-abda-dfda34ed0d0b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.597984 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b65af5-abae-4587-abda-dfda34ed0d0b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.598205 4687 
memory_manager.go:354] "RemoveStaleState removing state" podUID="3a80852c-64b9-446a-a5ee-4fc2310301ce" containerName="registry-server" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.598264 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b65af5-abae-4587-abda-dfda34ed0d0b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.598976 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.601348 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.601656 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.601788 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ffgb4" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.601950 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.604160 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.606517 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc"] Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.652847 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltzb6\" (UniqueName: \"kubernetes.io/projected/6f4d944c-dd63-414e-8886-5b38a982c01a-kube-api-access-ltzb6\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.652894 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.653250 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.653319 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.653484 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.653525 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.653546 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.755113 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.755461 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" Feb 28 09:38:48 crc 
kubenswrapper[4687]: I0228 09:38:48.755490 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.755587 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltzb6\" (UniqueName: \"kubernetes.io/projected/6f4d944c-dd63-414e-8886-5b38a982c01a-kube-api-access-ltzb6\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.755628 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.755774 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.755820 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.761385 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.761631 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.761876 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.762526 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc\" (UID: 
\"6f4d944c-dd63-414e-8886-5b38a982c01a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.762538 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.763487 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.770741 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltzb6\" (UniqueName: \"kubernetes.io/projected/6f4d944c-dd63-414e-8886-5b38a982c01a-kube-api-access-ltzb6\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" Feb 28 09:38:48 crc kubenswrapper[4687]: I0228 09:38:48.916157 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" Feb 28 09:38:49 crc kubenswrapper[4687]: I0228 09:38:49.359839 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc"] Feb 28 09:38:49 crc kubenswrapper[4687]: I0228 09:38:49.527719 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" event={"ID":"6f4d944c-dd63-414e-8886-5b38a982c01a","Type":"ContainerStarted","Data":"193f1c27ccb436787725973ba5a0fb8d27ad1d854a37341201d3f8cb142bb33f"} Feb 28 09:38:50 crc kubenswrapper[4687]: I0228 09:38:50.536083 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" event={"ID":"6f4d944c-dd63-414e-8886-5b38a982c01a","Type":"ContainerStarted","Data":"7ca85360233be00ed92a299ca12b0c28f0cdbfd49a30d5380511d0d9e3024c01"} Feb 28 09:39:25 crc kubenswrapper[4687]: I0228 09:39:25.002439 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:39:25 crc kubenswrapper[4687]: I0228 09:39:25.003131 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:39:55 crc kubenswrapper[4687]: I0228 09:39:55.002428 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:39:55 crc kubenswrapper[4687]: I0228 09:39:55.003299 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:40:00 crc kubenswrapper[4687]: I0228 09:40:00.155098 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" podStartSLOduration=71.639376426 podStartE2EDuration="1m12.15508162s" podCreationTimestamp="2026-02-28 09:38:48 +0000 UTC" firstStartedPulling="2026-02-28 09:38:49.361113413 +0000 UTC m=+2121.051682750" lastFinishedPulling="2026-02-28 09:38:49.876818606 +0000 UTC m=+2121.567387944" observedRunningTime="2026-02-28 09:38:50.555448247 +0000 UTC m=+2122.246017584" watchObservedRunningTime="2026-02-28 09:40:00.15508162 +0000 UTC m=+2191.845650958" Feb 28 09:40:00 crc kubenswrapper[4687]: I0228 09:40:00.158032 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537860-slrkh"] Feb 28 09:40:00 crc kubenswrapper[4687]: I0228 09:40:00.159075 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537860-slrkh" Feb 28 09:40:00 crc kubenswrapper[4687]: I0228 09:40:00.160890 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:40:00 crc kubenswrapper[4687]: I0228 09:40:00.161326 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:40:00 crc kubenswrapper[4687]: I0228 09:40:00.162262 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:40:00 crc kubenswrapper[4687]: I0228 09:40:00.170606 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537860-slrkh"] Feb 28 09:40:00 crc kubenswrapper[4687]: I0228 09:40:00.187072 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5fvv\" (UniqueName: \"kubernetes.io/projected/4e8df979-76aa-443b-a961-7a0f1252e386-kube-api-access-w5fvv\") pod \"auto-csr-approver-29537860-slrkh\" (UID: \"4e8df979-76aa-443b-a961-7a0f1252e386\") " pod="openshift-infra/auto-csr-approver-29537860-slrkh" Feb 28 09:40:00 crc kubenswrapper[4687]: I0228 09:40:00.288711 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5fvv\" (UniqueName: \"kubernetes.io/projected/4e8df979-76aa-443b-a961-7a0f1252e386-kube-api-access-w5fvv\") pod \"auto-csr-approver-29537860-slrkh\" (UID: \"4e8df979-76aa-443b-a961-7a0f1252e386\") " pod="openshift-infra/auto-csr-approver-29537860-slrkh" Feb 28 09:40:00 crc kubenswrapper[4687]: I0228 09:40:00.306150 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5fvv\" (UniqueName: \"kubernetes.io/projected/4e8df979-76aa-443b-a961-7a0f1252e386-kube-api-access-w5fvv\") pod \"auto-csr-approver-29537860-slrkh\" (UID: \"4e8df979-76aa-443b-a961-7a0f1252e386\") " 
pod="openshift-infra/auto-csr-approver-29537860-slrkh" Feb 28 09:40:00 crc kubenswrapper[4687]: I0228 09:40:00.474488 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537860-slrkh" Feb 28 09:40:00 crc kubenswrapper[4687]: I0228 09:40:00.878868 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537860-slrkh"] Feb 28 09:40:01 crc kubenswrapper[4687]: I0228 09:40:01.139079 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537860-slrkh" event={"ID":"4e8df979-76aa-443b-a961-7a0f1252e386","Type":"ContainerStarted","Data":"8b2b7048512c250509806d250183450a304350937511e49baa9e0bca9c668302"} Feb 28 09:40:03 crc kubenswrapper[4687]: I0228 09:40:03.166244 4687 generic.go:334] "Generic (PLEG): container finished" podID="4e8df979-76aa-443b-a961-7a0f1252e386" containerID="1c13cdb34ab669145698e419324ec5c2e3bd745e94064c63d5c4c651f0515c25" exitCode=0 Feb 28 09:40:03 crc kubenswrapper[4687]: I0228 09:40:03.166307 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537860-slrkh" event={"ID":"4e8df979-76aa-443b-a961-7a0f1252e386","Type":"ContainerDied","Data":"1c13cdb34ab669145698e419324ec5c2e3bd745e94064c63d5c4c651f0515c25"} Feb 28 09:40:04 crc kubenswrapper[4687]: I0228 09:40:04.458488 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537860-slrkh" Feb 28 09:40:04 crc kubenswrapper[4687]: I0228 09:40:04.589204 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5fvv\" (UniqueName: \"kubernetes.io/projected/4e8df979-76aa-443b-a961-7a0f1252e386-kube-api-access-w5fvv\") pod \"4e8df979-76aa-443b-a961-7a0f1252e386\" (UID: \"4e8df979-76aa-443b-a961-7a0f1252e386\") " Feb 28 09:40:04 crc kubenswrapper[4687]: I0228 09:40:04.597395 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e8df979-76aa-443b-a961-7a0f1252e386-kube-api-access-w5fvv" (OuterVolumeSpecName: "kube-api-access-w5fvv") pod "4e8df979-76aa-443b-a961-7a0f1252e386" (UID: "4e8df979-76aa-443b-a961-7a0f1252e386"). InnerVolumeSpecName "kube-api-access-w5fvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:40:04 crc kubenswrapper[4687]: I0228 09:40:04.691627 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5fvv\" (UniqueName: \"kubernetes.io/projected/4e8df979-76aa-443b-a961-7a0f1252e386-kube-api-access-w5fvv\") on node \"crc\" DevicePath \"\"" Feb 28 09:40:05 crc kubenswrapper[4687]: I0228 09:40:05.182325 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537860-slrkh" event={"ID":"4e8df979-76aa-443b-a961-7a0f1252e386","Type":"ContainerDied","Data":"8b2b7048512c250509806d250183450a304350937511e49baa9e0bca9c668302"} Feb 28 09:40:05 crc kubenswrapper[4687]: I0228 09:40:05.182571 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b2b7048512c250509806d250183450a304350937511e49baa9e0bca9c668302" Feb 28 09:40:05 crc kubenswrapper[4687]: I0228 09:40:05.182370 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537860-slrkh" Feb 28 09:40:05 crc kubenswrapper[4687]: I0228 09:40:05.527882 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537854-xj6v7"] Feb 28 09:40:05 crc kubenswrapper[4687]: I0228 09:40:05.532879 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537854-xj6v7"] Feb 28 09:40:06 crc kubenswrapper[4687]: I0228 09:40:06.669531 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="716080da-ec3b-4498-b033-a048e7ca9d11" path="/var/lib/kubelet/pods/716080da-ec3b-4498-b033-a048e7ca9d11/volumes" Feb 28 09:40:25 crc kubenswrapper[4687]: I0228 09:40:25.002472 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:40:25 crc kubenswrapper[4687]: I0228 09:40:25.003277 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:40:25 crc kubenswrapper[4687]: I0228 09:40:25.003336 4687 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" Feb 28 09:40:25 crc kubenswrapper[4687]: I0228 09:40:25.004015 4687 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9"} pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 28 09:40:25 crc kubenswrapper[4687]: I0228 09:40:25.004089 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" containerID="cri-o://483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" gracePeriod=600 Feb 28 09:40:25 crc kubenswrapper[4687]: E0228 09:40:25.120789 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:40:25 crc kubenswrapper[4687]: I0228 09:40:25.349748 4687 generic.go:334] "Generic (PLEG): container finished" podID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerID="483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" exitCode=0 Feb 28 09:40:25 crc kubenswrapper[4687]: I0228 09:40:25.349799 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerDied","Data":"483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9"} Feb 28 09:40:25 crc kubenswrapper[4687]: I0228 09:40:25.349843 4687 scope.go:117] "RemoveContainer" containerID="dae4760c42bdf35ff81f24568deadc7a5d5f1d56cf50f222534d7b17be296984" Feb 28 09:40:25 crc kubenswrapper[4687]: I0228 09:40:25.350466 4687 scope.go:117] "RemoveContainer" containerID="483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" Feb 28 09:40:25 crc kubenswrapper[4687]: E0228 09:40:25.350788 4687 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:40:34 crc kubenswrapper[4687]: I0228 09:40:34.424130 4687 generic.go:334] "Generic (PLEG): container finished" podID="6f4d944c-dd63-414e-8886-5b38a982c01a" containerID="7ca85360233be00ed92a299ca12b0c28f0cdbfd49a30d5380511d0d9e3024c01" exitCode=0 Feb 28 09:40:34 crc kubenswrapper[4687]: I0228 09:40:34.424229 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" event={"ID":"6f4d944c-dd63-414e-8886-5b38a982c01a","Type":"ContainerDied","Data":"7ca85360233be00ed92a299ca12b0c28f0cdbfd49a30d5380511d0d9e3024c01"} Feb 28 09:40:35 crc kubenswrapper[4687]: I0228 09:40:35.779955 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" Feb 28 09:40:35 crc kubenswrapper[4687]: I0228 09:40:35.825418 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-telemetry-combined-ca-bundle\") pod \"6f4d944c-dd63-414e-8886-5b38a982c01a\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " Feb 28 09:40:35 crc kubenswrapper[4687]: I0228 09:40:35.830226 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "6f4d944c-dd63-414e-8886-5b38a982c01a" (UID: "6f4d944c-dd63-414e-8886-5b38a982c01a"). 
InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:40:35 crc kubenswrapper[4687]: I0228 09:40:35.927142 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-ceilometer-compute-config-data-2\") pod \"6f4d944c-dd63-414e-8886-5b38a982c01a\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " Feb 28 09:40:35 crc kubenswrapper[4687]: I0228 09:40:35.927233 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltzb6\" (UniqueName: \"kubernetes.io/projected/6f4d944c-dd63-414e-8886-5b38a982c01a-kube-api-access-ltzb6\") pod \"6f4d944c-dd63-414e-8886-5b38a982c01a\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " Feb 28 09:40:35 crc kubenswrapper[4687]: I0228 09:40:35.927312 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-ceilometer-compute-config-data-1\") pod \"6f4d944c-dd63-414e-8886-5b38a982c01a\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " Feb 28 09:40:35 crc kubenswrapper[4687]: I0228 09:40:35.927355 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-ceilometer-compute-config-data-0\") pod \"6f4d944c-dd63-414e-8886-5b38a982c01a\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " Feb 28 09:40:35 crc kubenswrapper[4687]: I0228 09:40:35.927408 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-inventory\") pod \"6f4d944c-dd63-414e-8886-5b38a982c01a\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " Feb 28 09:40:35 crc 
kubenswrapper[4687]: I0228 09:40:35.927455 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-ssh-key-openstack-edpm-ipam\") pod \"6f4d944c-dd63-414e-8886-5b38a982c01a\" (UID: \"6f4d944c-dd63-414e-8886-5b38a982c01a\") " Feb 28 09:40:35 crc kubenswrapper[4687]: I0228 09:40:35.927969 4687 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 09:40:35 crc kubenswrapper[4687]: I0228 09:40:35.932515 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f4d944c-dd63-414e-8886-5b38a982c01a-kube-api-access-ltzb6" (OuterVolumeSpecName: "kube-api-access-ltzb6") pod "6f4d944c-dd63-414e-8886-5b38a982c01a" (UID: "6f4d944c-dd63-414e-8886-5b38a982c01a"). InnerVolumeSpecName "kube-api-access-ltzb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:40:35 crc kubenswrapper[4687]: I0228 09:40:35.952694 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "6f4d944c-dd63-414e-8886-5b38a982c01a" (UID: "6f4d944c-dd63-414e-8886-5b38a982c01a"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:40:35 crc kubenswrapper[4687]: I0228 09:40:35.952749 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6f4d944c-dd63-414e-8886-5b38a982c01a" (UID: "6f4d944c-dd63-414e-8886-5b38a982c01a"). 
InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:40:35 crc kubenswrapper[4687]: I0228 09:40:35.952823 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "6f4d944c-dd63-414e-8886-5b38a982c01a" (UID: "6f4d944c-dd63-414e-8886-5b38a982c01a"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:40:35 crc kubenswrapper[4687]: I0228 09:40:35.953763 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "6f4d944c-dd63-414e-8886-5b38a982c01a" (UID: "6f4d944c-dd63-414e-8886-5b38a982c01a"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:40:35 crc kubenswrapper[4687]: I0228 09:40:35.960209 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-inventory" (OuterVolumeSpecName: "inventory") pod "6f4d944c-dd63-414e-8886-5b38a982c01a" (UID: "6f4d944c-dd63-414e-8886-5b38a982c01a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:40:36 crc kubenswrapper[4687]: I0228 09:40:36.030510 4687 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 28 09:40:36 crc kubenswrapper[4687]: I0228 09:40:36.030563 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltzb6\" (UniqueName: \"kubernetes.io/projected/6f4d944c-dd63-414e-8886-5b38a982c01a-kube-api-access-ltzb6\") on node \"crc\" DevicePath \"\"" Feb 28 09:40:36 crc kubenswrapper[4687]: I0228 09:40:36.030582 4687 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 28 09:40:36 crc kubenswrapper[4687]: I0228 09:40:36.030827 4687 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 28 09:40:36 crc kubenswrapper[4687]: I0228 09:40:36.030839 4687 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-inventory\") on node \"crc\" DevicePath \"\"" Feb 28 09:40:36 crc kubenswrapper[4687]: I0228 09:40:36.030848 4687 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6f4d944c-dd63-414e-8886-5b38a982c01a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 28 09:40:36 crc kubenswrapper[4687]: I0228 09:40:36.450068 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" 
event={"ID":"6f4d944c-dd63-414e-8886-5b38a982c01a","Type":"ContainerDied","Data":"193f1c27ccb436787725973ba5a0fb8d27ad1d854a37341201d3f8cb142bb33f"} Feb 28 09:40:36 crc kubenswrapper[4687]: I0228 09:40:36.450115 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="193f1c27ccb436787725973ba5a0fb8d27ad1d854a37341201d3f8cb142bb33f" Feb 28 09:40:36 crc kubenswrapper[4687]: I0228 09:40:36.450208 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc" Feb 28 09:40:38 crc kubenswrapper[4687]: I0228 09:40:38.661119 4687 scope.go:117] "RemoveContainer" containerID="483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" Feb 28 09:40:38 crc kubenswrapper[4687]: E0228 09:40:38.661605 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:40:43 crc kubenswrapper[4687]: I0228 09:40:43.327157 4687 scope.go:117] "RemoveContainer" containerID="fdb258840516d4030e15d02445e35edb31906d47bf058002d6dfe8d312c52131" Feb 28 09:40:53 crc kubenswrapper[4687]: I0228 09:40:53.656820 4687 scope.go:117] "RemoveContainer" containerID="483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" Feb 28 09:40:53 crc kubenswrapper[4687]: E0228 09:40:53.657816 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:41:06 crc kubenswrapper[4687]: I0228 09:41:06.657383 4687 scope.go:117] "RemoveContainer" containerID="483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" Feb 28 09:41:06 crc kubenswrapper[4687]: E0228 09:41:06.658123 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:41:20 crc kubenswrapper[4687]: I0228 09:41:20.657009 4687 scope.go:117] "RemoveContainer" containerID="483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" Feb 28 09:41:20 crc kubenswrapper[4687]: E0228 09:41:20.657505 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.522641 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Feb 28 09:41:23 crc kubenswrapper[4687]: E0228 09:41:23.523746 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4d944c-dd63-414e-8886-5b38a982c01a" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.523826 4687 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6f4d944c-dd63-414e-8886-5b38a982c01a" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 28 09:41:23 crc kubenswrapper[4687]: E0228 09:41:23.523896 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e8df979-76aa-443b-a961-7a0f1252e386" containerName="oc" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.523949 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e8df979-76aa-443b-a961-7a0f1252e386" containerName="oc" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.524173 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e8df979-76aa-443b-a961-7a0f1252e386" containerName="oc" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.524257 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f4d944c-dd63-414e-8886-5b38a982c01a" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.524832 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.526628 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.526664 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.526715 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.526839 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-wqjcg" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.531232 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.570647 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.570788 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.570883 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-config-data\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.672570 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.672630 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.672689 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.672731 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-config-data\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.672832 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.672865 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.672884 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.672907 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4x28\" (UniqueName: \"kubernetes.io/projected/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-kube-api-access-p4x28\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.673070 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.673757 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-config-data\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.674322 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.678334 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.774770 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.774810 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.774829 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: 
\"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.774849 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4x28\" (UniqueName: \"kubernetes.io/projected/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-kube-api-access-p4x28\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.774939 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.775005 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.775157 4687 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.775286 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " 
pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.775413 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.778123 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.778448 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.789112 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4x28\" (UniqueName: \"kubernetes.io/projected/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-kube-api-access-p4x28\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.793902 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " pod="openstack/tempest-tests-tempest" Feb 28 09:41:23 crc kubenswrapper[4687]: I0228 09:41:23.841503 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 28 09:41:24 crc kubenswrapper[4687]: I0228 09:41:24.208834 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 28 09:41:24 crc kubenswrapper[4687]: W0228 09:41:24.210459 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3d191c1_f8c8_455f_848c_a3d0a7caaf81.slice/crio-f26d7650fac8251cf40cacca947bbb0db3dc3a01c49d31464219350200af564f WatchSource:0}: Error finding container f26d7650fac8251cf40cacca947bbb0db3dc3a01c49d31464219350200af564f: Status 404 returned error can't find the container with id f26d7650fac8251cf40cacca947bbb0db3dc3a01c49d31464219350200af564f Feb 28 09:41:24 crc kubenswrapper[4687]: I0228 09:41:24.818819 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e3d191c1-f8c8-455f-848c-a3d0a7caaf81","Type":"ContainerStarted","Data":"f26d7650fac8251cf40cacca947bbb0db3dc3a01c49d31464219350200af564f"} Feb 28 09:41:33 crc kubenswrapper[4687]: I0228 09:41:33.661757 4687 scope.go:117] "RemoveContainer" containerID="483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" Feb 28 09:41:33 crc kubenswrapper[4687]: E0228 09:41:33.662785 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:41:46 crc kubenswrapper[4687]: I0228 09:41:46.656802 4687 scope.go:117] "RemoveContainer" containerID="483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" Feb 28 09:41:46 crc kubenswrapper[4687]: E0228 09:41:46.657559 4687 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:41:56 crc kubenswrapper[4687]: E0228 09:41:56.224315 4687 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Feb 28 09:41:56 crc kubenswrapper[4687]: E0228 09:41:56.224953 4687 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest
/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p4x28,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(e3d191c1-f8c8-455f-848c-a3d0a7caaf81): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" 
logger="UnhandledError" Feb 28 09:41:56 crc kubenswrapper[4687]: E0228 09:41:56.226150 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="e3d191c1-f8c8-455f-848c-a3d0a7caaf81" Feb 28 09:41:57 crc kubenswrapper[4687]: E0228 09:41:57.033120 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="e3d191c1-f8c8-455f-848c-a3d0a7caaf81" Feb 28 09:41:57 crc kubenswrapper[4687]: I0228 09:41:57.657414 4687 scope.go:117] "RemoveContainer" containerID="483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" Feb 28 09:41:57 crc kubenswrapper[4687]: E0228 09:41:57.657709 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:42:00 crc kubenswrapper[4687]: I0228 09:42:00.137566 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537862-mmzzs"] Feb 28 09:42:00 crc kubenswrapper[4687]: I0228 09:42:00.140352 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537862-mmzzs" Feb 28 09:42:00 crc kubenswrapper[4687]: I0228 09:42:00.142350 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:42:00 crc kubenswrapper[4687]: I0228 09:42:00.142355 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:42:00 crc kubenswrapper[4687]: I0228 09:42:00.142358 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:42:00 crc kubenswrapper[4687]: I0228 09:42:00.147872 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537862-mmzzs"] Feb 28 09:42:00 crc kubenswrapper[4687]: I0228 09:42:00.268609 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r7tl\" (UniqueName: \"kubernetes.io/projected/56adcdd4-04e5-427f-a293-561e788041fb-kube-api-access-8r7tl\") pod \"auto-csr-approver-29537862-mmzzs\" (UID: \"56adcdd4-04e5-427f-a293-561e788041fb\") " pod="openshift-infra/auto-csr-approver-29537862-mmzzs" Feb 28 09:42:00 crc kubenswrapper[4687]: I0228 09:42:00.369978 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r7tl\" (UniqueName: \"kubernetes.io/projected/56adcdd4-04e5-427f-a293-561e788041fb-kube-api-access-8r7tl\") pod \"auto-csr-approver-29537862-mmzzs\" (UID: \"56adcdd4-04e5-427f-a293-561e788041fb\") " pod="openshift-infra/auto-csr-approver-29537862-mmzzs" Feb 28 09:42:00 crc kubenswrapper[4687]: I0228 09:42:00.385871 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r7tl\" (UniqueName: \"kubernetes.io/projected/56adcdd4-04e5-427f-a293-561e788041fb-kube-api-access-8r7tl\") pod \"auto-csr-approver-29537862-mmzzs\" (UID: \"56adcdd4-04e5-427f-a293-561e788041fb\") " 
pod="openshift-infra/auto-csr-approver-29537862-mmzzs" Feb 28 09:42:00 crc kubenswrapper[4687]: I0228 09:42:00.455189 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537862-mmzzs" Feb 28 09:42:00 crc kubenswrapper[4687]: I0228 09:42:00.848671 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537862-mmzzs"] Feb 28 09:42:01 crc kubenswrapper[4687]: I0228 09:42:01.062130 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537862-mmzzs" event={"ID":"56adcdd4-04e5-427f-a293-561e788041fb","Type":"ContainerStarted","Data":"416a3931616738bf0ee3d4694ea5407ce3697dff118f598e6b62f8cec2616619"} Feb 28 09:42:02 crc kubenswrapper[4687]: I0228 09:42:02.070107 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537862-mmzzs" event={"ID":"56adcdd4-04e5-427f-a293-561e788041fb","Type":"ContainerStarted","Data":"0cc4e2d2f047062a2998acae3525c7abe8a05ff34543fc14766427ed720cc3ac"} Feb 28 09:42:02 crc kubenswrapper[4687]: I0228 09:42:02.082801 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29537862-mmzzs" podStartSLOduration=1.266571125 podStartE2EDuration="2.082786811s" podCreationTimestamp="2026-02-28 09:42:00 +0000 UTC" firstStartedPulling="2026-02-28 09:42:00.876622982 +0000 UTC m=+2312.567192319" lastFinishedPulling="2026-02-28 09:42:01.692838668 +0000 UTC m=+2313.383408005" observedRunningTime="2026-02-28 09:42:02.079173088 +0000 UTC m=+2313.769742425" watchObservedRunningTime="2026-02-28 09:42:02.082786811 +0000 UTC m=+2313.773356148" Feb 28 09:42:03 crc kubenswrapper[4687]: I0228 09:42:03.077790 4687 generic.go:334] "Generic (PLEG): container finished" podID="56adcdd4-04e5-427f-a293-561e788041fb" containerID="0cc4e2d2f047062a2998acae3525c7abe8a05ff34543fc14766427ed720cc3ac" exitCode=0 Feb 28 09:42:03 crc 
kubenswrapper[4687]: I0228 09:42:03.077859 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537862-mmzzs" event={"ID":"56adcdd4-04e5-427f-a293-561e788041fb","Type":"ContainerDied","Data":"0cc4e2d2f047062a2998acae3525c7abe8a05ff34543fc14766427ed720cc3ac"} Feb 28 09:42:04 crc kubenswrapper[4687]: I0228 09:42:04.330698 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537862-mmzzs" Feb 28 09:42:04 crc kubenswrapper[4687]: I0228 09:42:04.531582 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r7tl\" (UniqueName: \"kubernetes.io/projected/56adcdd4-04e5-427f-a293-561e788041fb-kube-api-access-8r7tl\") pod \"56adcdd4-04e5-427f-a293-561e788041fb\" (UID: \"56adcdd4-04e5-427f-a293-561e788041fb\") " Feb 28 09:42:04 crc kubenswrapper[4687]: I0228 09:42:04.536906 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56adcdd4-04e5-427f-a293-561e788041fb-kube-api-access-8r7tl" (OuterVolumeSpecName: "kube-api-access-8r7tl") pod "56adcdd4-04e5-427f-a293-561e788041fb" (UID: "56adcdd4-04e5-427f-a293-561e788041fb"). InnerVolumeSpecName "kube-api-access-8r7tl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:42:04 crc kubenswrapper[4687]: I0228 09:42:04.633894 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r7tl\" (UniqueName: \"kubernetes.io/projected/56adcdd4-04e5-427f-a293-561e788041fb-kube-api-access-8r7tl\") on node \"crc\" DevicePath \"\"" Feb 28 09:42:05 crc kubenswrapper[4687]: I0228 09:42:05.092707 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537862-mmzzs" Feb 28 09:42:05 crc kubenswrapper[4687]: I0228 09:42:05.092675 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537862-mmzzs" event={"ID":"56adcdd4-04e5-427f-a293-561e788041fb","Type":"ContainerDied","Data":"416a3931616738bf0ee3d4694ea5407ce3697dff118f598e6b62f8cec2616619"} Feb 28 09:42:05 crc kubenswrapper[4687]: I0228 09:42:05.093193 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="416a3931616738bf0ee3d4694ea5407ce3697dff118f598e6b62f8cec2616619" Feb 28 09:42:05 crc kubenswrapper[4687]: I0228 09:42:05.133257 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537856-gvn4m"] Feb 28 09:42:05 crc kubenswrapper[4687]: I0228 09:42:05.140507 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537856-gvn4m"] Feb 28 09:42:06 crc kubenswrapper[4687]: I0228 09:42:06.665266 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5278aa8f-5ca9-4e1c-b485-9e771d15c63d" path="/var/lib/kubelet/pods/5278aa8f-5ca9-4e1c-b485-9e771d15c63d/volumes" Feb 28 09:42:08 crc kubenswrapper[4687]: I0228 09:42:08.179316 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 28 09:42:09 crc kubenswrapper[4687]: I0228 09:42:09.121219 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e3d191c1-f8c8-455f-848c-a3d0a7caaf81","Type":"ContainerStarted","Data":"1f5a5aebbc6dc09566d977938b19d5c3fbb5d71509901c67394ed1d4ce42b645"} Feb 28 09:42:09 crc kubenswrapper[4687]: I0228 09:42:09.136430 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.17159268 podStartE2EDuration="47.136413154s" podCreationTimestamp="2026-02-28 09:41:22 +0000 
UTC" firstStartedPulling="2026-02-28 09:41:24.212422485 +0000 UTC m=+2275.902991822" lastFinishedPulling="2026-02-28 09:42:08.177242959 +0000 UTC m=+2319.867812296" observedRunningTime="2026-02-28 09:42:09.132478147 +0000 UTC m=+2320.823047484" watchObservedRunningTime="2026-02-28 09:42:09.136413154 +0000 UTC m=+2320.826982491" Feb 28 09:42:09 crc kubenswrapper[4687]: I0228 09:42:09.656246 4687 scope.go:117] "RemoveContainer" containerID="483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" Feb 28 09:42:09 crc kubenswrapper[4687]: E0228 09:42:09.657458 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:42:20 crc kubenswrapper[4687]: I0228 09:42:20.656486 4687 scope.go:117] "RemoveContainer" containerID="483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" Feb 28 09:42:20 crc kubenswrapper[4687]: E0228 09:42:20.657121 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:42:32 crc kubenswrapper[4687]: I0228 09:42:32.657939 4687 scope.go:117] "RemoveContainer" containerID="483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" Feb 28 09:42:32 crc kubenswrapper[4687]: E0228 09:42:32.658648 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:42:43 crc kubenswrapper[4687]: I0228 09:42:43.397870 4687 scope.go:117] "RemoveContainer" containerID="c556349e72ced23123967647159678349d39af8ed507304146ffc55a17f4f35a" Feb 28 09:42:47 crc kubenswrapper[4687]: I0228 09:42:47.656383 4687 scope.go:117] "RemoveContainer" containerID="483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" Feb 28 09:42:47 crc kubenswrapper[4687]: E0228 09:42:47.657173 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:42:58 crc kubenswrapper[4687]: I0228 09:42:58.660913 4687 scope.go:117] "RemoveContainer" containerID="483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" Feb 28 09:42:58 crc kubenswrapper[4687]: E0228 09:42:58.661722 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:43:11 crc kubenswrapper[4687]: I0228 09:43:11.657459 4687 scope.go:117] "RemoveContainer" 
containerID="483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" Feb 28 09:43:11 crc kubenswrapper[4687]: E0228 09:43:11.658158 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:43:24 crc kubenswrapper[4687]: I0228 09:43:24.657764 4687 scope.go:117] "RemoveContainer" containerID="483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" Feb 28 09:43:24 crc kubenswrapper[4687]: E0228 09:43:24.658595 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:43:39 crc kubenswrapper[4687]: I0228 09:43:39.657220 4687 scope.go:117] "RemoveContainer" containerID="483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" Feb 28 09:43:39 crc kubenswrapper[4687]: E0228 09:43:39.658620 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:43:54 crc kubenswrapper[4687]: I0228 09:43:54.656777 4687 scope.go:117] 
"RemoveContainer" containerID="483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" Feb 28 09:43:54 crc kubenswrapper[4687]: E0228 09:43:54.657466 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:44:00 crc kubenswrapper[4687]: I0228 09:44:00.141745 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537864-sslns"] Feb 28 09:44:00 crc kubenswrapper[4687]: E0228 09:44:00.142730 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56adcdd4-04e5-427f-a293-561e788041fb" containerName="oc" Feb 28 09:44:00 crc kubenswrapper[4687]: I0228 09:44:00.142741 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="56adcdd4-04e5-427f-a293-561e788041fb" containerName="oc" Feb 28 09:44:00 crc kubenswrapper[4687]: I0228 09:44:00.142924 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="56adcdd4-04e5-427f-a293-561e788041fb" containerName="oc" Feb 28 09:44:00 crc kubenswrapper[4687]: I0228 09:44:00.143510 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537864-sslns" Feb 28 09:44:00 crc kubenswrapper[4687]: I0228 09:44:00.148572 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:44:00 crc kubenswrapper[4687]: I0228 09:44:00.149128 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:44:00 crc kubenswrapper[4687]: I0228 09:44:00.149213 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:44:00 crc kubenswrapper[4687]: I0228 09:44:00.160186 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537864-sslns"] Feb 28 09:44:00 crc kubenswrapper[4687]: I0228 09:44:00.280083 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8gzw\" (UniqueName: \"kubernetes.io/projected/d6299991-2d4b-4e15-93e5-4fc11d251557-kube-api-access-b8gzw\") pod \"auto-csr-approver-29537864-sslns\" (UID: \"d6299991-2d4b-4e15-93e5-4fc11d251557\") " pod="openshift-infra/auto-csr-approver-29537864-sslns" Feb 28 09:44:00 crc kubenswrapper[4687]: I0228 09:44:00.381810 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8gzw\" (UniqueName: \"kubernetes.io/projected/d6299991-2d4b-4e15-93e5-4fc11d251557-kube-api-access-b8gzw\") pod \"auto-csr-approver-29537864-sslns\" (UID: \"d6299991-2d4b-4e15-93e5-4fc11d251557\") " pod="openshift-infra/auto-csr-approver-29537864-sslns" Feb 28 09:44:00 crc kubenswrapper[4687]: I0228 09:44:00.399225 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8gzw\" (UniqueName: \"kubernetes.io/projected/d6299991-2d4b-4e15-93e5-4fc11d251557-kube-api-access-b8gzw\") pod \"auto-csr-approver-29537864-sslns\" (UID: \"d6299991-2d4b-4e15-93e5-4fc11d251557\") " 
pod="openshift-infra/auto-csr-approver-29537864-sslns" Feb 28 09:44:00 crc kubenswrapper[4687]: I0228 09:44:00.456582 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537864-sslns" Feb 28 09:44:00 crc kubenswrapper[4687]: I0228 09:44:00.920063 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537864-sslns"] Feb 28 09:44:00 crc kubenswrapper[4687]: I0228 09:44:00.934232 4687 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 28 09:44:01 crc kubenswrapper[4687]: I0228 09:44:01.916112 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537864-sslns" event={"ID":"d6299991-2d4b-4e15-93e5-4fc11d251557","Type":"ContainerStarted","Data":"4385b433f75c30dd2a3a2ab3c51ee7ec3b6416a9198298b10348ec7f190a10db"} Feb 28 09:44:02 crc kubenswrapper[4687]: I0228 09:44:02.924759 4687 generic.go:334] "Generic (PLEG): container finished" podID="d6299991-2d4b-4e15-93e5-4fc11d251557" containerID="bf72747927e3e81008044df684b9cdf57d54632515a57c81695c59b4974c19b6" exitCode=0 Feb 28 09:44:02 crc kubenswrapper[4687]: I0228 09:44:02.924864 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537864-sslns" event={"ID":"d6299991-2d4b-4e15-93e5-4fc11d251557","Type":"ContainerDied","Data":"bf72747927e3e81008044df684b9cdf57d54632515a57c81695c59b4974c19b6"} Feb 28 09:44:04 crc kubenswrapper[4687]: I0228 09:44:04.214447 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537864-sslns" Feb 28 09:44:04 crc kubenswrapper[4687]: I0228 09:44:04.356637 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8gzw\" (UniqueName: \"kubernetes.io/projected/d6299991-2d4b-4e15-93e5-4fc11d251557-kube-api-access-b8gzw\") pod \"d6299991-2d4b-4e15-93e5-4fc11d251557\" (UID: \"d6299991-2d4b-4e15-93e5-4fc11d251557\") " Feb 28 09:44:04 crc kubenswrapper[4687]: I0228 09:44:04.361562 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6299991-2d4b-4e15-93e5-4fc11d251557-kube-api-access-b8gzw" (OuterVolumeSpecName: "kube-api-access-b8gzw") pod "d6299991-2d4b-4e15-93e5-4fc11d251557" (UID: "d6299991-2d4b-4e15-93e5-4fc11d251557"). InnerVolumeSpecName "kube-api-access-b8gzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:44:04 crc kubenswrapper[4687]: I0228 09:44:04.458913 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8gzw\" (UniqueName: \"kubernetes.io/projected/d6299991-2d4b-4e15-93e5-4fc11d251557-kube-api-access-b8gzw\") on node \"crc\" DevicePath \"\"" Feb 28 09:44:04 crc kubenswrapper[4687]: I0228 09:44:04.940359 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537864-sslns" event={"ID":"d6299991-2d4b-4e15-93e5-4fc11d251557","Type":"ContainerDied","Data":"4385b433f75c30dd2a3a2ab3c51ee7ec3b6416a9198298b10348ec7f190a10db"} Feb 28 09:44:04 crc kubenswrapper[4687]: I0228 09:44:04.940652 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4385b433f75c30dd2a3a2ab3c51ee7ec3b6416a9198298b10348ec7f190a10db" Feb 28 09:44:04 crc kubenswrapper[4687]: I0228 09:44:04.940708 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537864-sslns" Feb 28 09:44:05 crc kubenswrapper[4687]: I0228 09:44:05.259291 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537858-prsjl"] Feb 28 09:44:05 crc kubenswrapper[4687]: I0228 09:44:05.266096 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537858-prsjl"] Feb 28 09:44:06 crc kubenswrapper[4687]: I0228 09:44:06.663748 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e761c01a-dc8b-4439-8539-e65e64d6c8bb" path="/var/lib/kubelet/pods/e761c01a-dc8b-4439-8539-e65e64d6c8bb/volumes" Feb 28 09:44:07 crc kubenswrapper[4687]: I0228 09:44:07.656916 4687 scope.go:117] "RemoveContainer" containerID="483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" Feb 28 09:44:07 crc kubenswrapper[4687]: E0228 09:44:07.657616 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:44:21 crc kubenswrapper[4687]: I0228 09:44:21.657163 4687 scope.go:117] "RemoveContainer" containerID="483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" Feb 28 09:44:21 crc kubenswrapper[4687]: E0228 09:44:21.658311 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" 
podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:44:36 crc kubenswrapper[4687]: I0228 09:44:36.662299 4687 scope.go:117] "RemoveContainer" containerID="483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" Feb 28 09:44:36 crc kubenswrapper[4687]: E0228 09:44:36.663139 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:44:43 crc kubenswrapper[4687]: I0228 09:44:43.464894 4687 scope.go:117] "RemoveContainer" containerID="3142e563c901316ed2aea43a72ef45305de960166165e018b8112e00fae7adc9" Feb 28 09:44:51 crc kubenswrapper[4687]: I0228 09:44:51.657450 4687 scope.go:117] "RemoveContainer" containerID="483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" Feb 28 09:44:51 crc kubenswrapper[4687]: E0228 09:44:51.658508 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:45:00 crc kubenswrapper[4687]: I0228 09:45:00.135180 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537865-wft68"] Feb 28 09:45:00 crc kubenswrapper[4687]: E0228 09:45:00.136325 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6299991-2d4b-4e15-93e5-4fc11d251557" containerName="oc" Feb 28 09:45:00 crc kubenswrapper[4687]: I0228 
09:45:00.136338 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6299991-2d4b-4e15-93e5-4fc11d251557" containerName="oc" Feb 28 09:45:00 crc kubenswrapper[4687]: I0228 09:45:00.136523 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6299991-2d4b-4e15-93e5-4fc11d251557" containerName="oc" Feb 28 09:45:00 crc kubenswrapper[4687]: I0228 09:45:00.137128 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537865-wft68" Feb 28 09:45:00 crc kubenswrapper[4687]: I0228 09:45:00.141478 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 28 09:45:00 crc kubenswrapper[4687]: I0228 09:45:00.141690 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 28 09:45:00 crc kubenswrapper[4687]: I0228 09:45:00.150201 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537865-wft68"] Feb 28 09:45:00 crc kubenswrapper[4687]: I0228 09:45:00.260564 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sj8d\" (UniqueName: \"kubernetes.io/projected/ca7e9ee9-5b1f-4e41-a0c8-6581557076bd-kube-api-access-8sj8d\") pod \"collect-profiles-29537865-wft68\" (UID: \"ca7e9ee9-5b1f-4e41-a0c8-6581557076bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537865-wft68" Feb 28 09:45:00 crc kubenswrapper[4687]: I0228 09:45:00.260711 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ca7e9ee9-5b1f-4e41-a0c8-6581557076bd-secret-volume\") pod \"collect-profiles-29537865-wft68\" (UID: \"ca7e9ee9-5b1f-4e41-a0c8-6581557076bd\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29537865-wft68" Feb 28 09:45:00 crc kubenswrapper[4687]: I0228 09:45:00.260891 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca7e9ee9-5b1f-4e41-a0c8-6581557076bd-config-volume\") pod \"collect-profiles-29537865-wft68\" (UID: \"ca7e9ee9-5b1f-4e41-a0c8-6581557076bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537865-wft68" Feb 28 09:45:00 crc kubenswrapper[4687]: I0228 09:45:00.362360 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca7e9ee9-5b1f-4e41-a0c8-6581557076bd-config-volume\") pod \"collect-profiles-29537865-wft68\" (UID: \"ca7e9ee9-5b1f-4e41-a0c8-6581557076bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537865-wft68" Feb 28 09:45:00 crc kubenswrapper[4687]: I0228 09:45:00.362411 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sj8d\" (UniqueName: \"kubernetes.io/projected/ca7e9ee9-5b1f-4e41-a0c8-6581557076bd-kube-api-access-8sj8d\") pod \"collect-profiles-29537865-wft68\" (UID: \"ca7e9ee9-5b1f-4e41-a0c8-6581557076bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537865-wft68" Feb 28 09:45:00 crc kubenswrapper[4687]: I0228 09:45:00.362468 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ca7e9ee9-5b1f-4e41-a0c8-6581557076bd-secret-volume\") pod \"collect-profiles-29537865-wft68\" (UID: \"ca7e9ee9-5b1f-4e41-a0c8-6581557076bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537865-wft68" Feb 28 09:45:00 crc kubenswrapper[4687]: I0228 09:45:00.363208 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/ca7e9ee9-5b1f-4e41-a0c8-6581557076bd-config-volume\") pod \"collect-profiles-29537865-wft68\" (UID: \"ca7e9ee9-5b1f-4e41-a0c8-6581557076bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537865-wft68" Feb 28 09:45:00 crc kubenswrapper[4687]: I0228 09:45:00.368508 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ca7e9ee9-5b1f-4e41-a0c8-6581557076bd-secret-volume\") pod \"collect-profiles-29537865-wft68\" (UID: \"ca7e9ee9-5b1f-4e41-a0c8-6581557076bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537865-wft68" Feb 28 09:45:00 crc kubenswrapper[4687]: I0228 09:45:00.374982 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sj8d\" (UniqueName: \"kubernetes.io/projected/ca7e9ee9-5b1f-4e41-a0c8-6581557076bd-kube-api-access-8sj8d\") pod \"collect-profiles-29537865-wft68\" (UID: \"ca7e9ee9-5b1f-4e41-a0c8-6581557076bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537865-wft68" Feb 28 09:45:00 crc kubenswrapper[4687]: I0228 09:45:00.459856 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537865-wft68" Feb 28 09:45:01 crc kubenswrapper[4687]: I0228 09:45:00.848856 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537865-wft68"] Feb 28 09:45:01 crc kubenswrapper[4687]: I0228 09:45:01.323501 4687 generic.go:334] "Generic (PLEG): container finished" podID="ca7e9ee9-5b1f-4e41-a0c8-6581557076bd" containerID="d27428dad42ab1273cd984d036e7e3920081c8536fe043d7ed0a03bbe1ecf9e9" exitCode=0 Feb 28 09:45:01 crc kubenswrapper[4687]: I0228 09:45:01.323610 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537865-wft68" event={"ID":"ca7e9ee9-5b1f-4e41-a0c8-6581557076bd","Type":"ContainerDied","Data":"d27428dad42ab1273cd984d036e7e3920081c8536fe043d7ed0a03bbe1ecf9e9"} Feb 28 09:45:01 crc kubenswrapper[4687]: I0228 09:45:01.323847 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537865-wft68" event={"ID":"ca7e9ee9-5b1f-4e41-a0c8-6581557076bd","Type":"ContainerStarted","Data":"c6b8374e61a285fe0a73b0e7716839b1c198b184ee560f8db05fdb7b6af3c2f0"} Feb 28 09:45:02 crc kubenswrapper[4687]: I0228 09:45:02.620572 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537865-wft68" Feb 28 09:45:02 crc kubenswrapper[4687]: I0228 09:45:02.710810 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ca7e9ee9-5b1f-4e41-a0c8-6581557076bd-secret-volume\") pod \"ca7e9ee9-5b1f-4e41-a0c8-6581557076bd\" (UID: \"ca7e9ee9-5b1f-4e41-a0c8-6581557076bd\") " Feb 28 09:45:02 crc kubenswrapper[4687]: I0228 09:45:02.711615 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sj8d\" (UniqueName: \"kubernetes.io/projected/ca7e9ee9-5b1f-4e41-a0c8-6581557076bd-kube-api-access-8sj8d\") pod \"ca7e9ee9-5b1f-4e41-a0c8-6581557076bd\" (UID: \"ca7e9ee9-5b1f-4e41-a0c8-6581557076bd\") " Feb 28 09:45:02 crc kubenswrapper[4687]: I0228 09:45:02.711945 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca7e9ee9-5b1f-4e41-a0c8-6581557076bd-config-volume\") pod \"ca7e9ee9-5b1f-4e41-a0c8-6581557076bd\" (UID: \"ca7e9ee9-5b1f-4e41-a0c8-6581557076bd\") " Feb 28 09:45:02 crc kubenswrapper[4687]: I0228 09:45:02.712451 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca7e9ee9-5b1f-4e41-a0c8-6581557076bd-config-volume" (OuterVolumeSpecName: "config-volume") pod "ca7e9ee9-5b1f-4e41-a0c8-6581557076bd" (UID: "ca7e9ee9-5b1f-4e41-a0c8-6581557076bd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:45:02 crc kubenswrapper[4687]: I0228 09:45:02.716435 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca7e9ee9-5b1f-4e41-a0c8-6581557076bd-kube-api-access-8sj8d" (OuterVolumeSpecName: "kube-api-access-8sj8d") pod "ca7e9ee9-5b1f-4e41-a0c8-6581557076bd" (UID: "ca7e9ee9-5b1f-4e41-a0c8-6581557076bd"). 
InnerVolumeSpecName "kube-api-access-8sj8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:45:02 crc kubenswrapper[4687]: I0228 09:45:02.716575 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca7e9ee9-5b1f-4e41-a0c8-6581557076bd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ca7e9ee9-5b1f-4e41-a0c8-6581557076bd" (UID: "ca7e9ee9-5b1f-4e41-a0c8-6581557076bd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:45:02 crc kubenswrapper[4687]: I0228 09:45:02.814796 4687 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ca7e9ee9-5b1f-4e41-a0c8-6581557076bd-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 28 09:45:02 crc kubenswrapper[4687]: I0228 09:45:02.814826 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sj8d\" (UniqueName: \"kubernetes.io/projected/ca7e9ee9-5b1f-4e41-a0c8-6581557076bd-kube-api-access-8sj8d\") on node \"crc\" DevicePath \"\"" Feb 28 09:45:02 crc kubenswrapper[4687]: I0228 09:45:02.814836 4687 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca7e9ee9-5b1f-4e41-a0c8-6581557076bd-config-volume\") on node \"crc\" DevicePath \"\"" Feb 28 09:45:03 crc kubenswrapper[4687]: I0228 09:45:03.337120 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537865-wft68" event={"ID":"ca7e9ee9-5b1f-4e41-a0c8-6581557076bd","Type":"ContainerDied","Data":"c6b8374e61a285fe0a73b0e7716839b1c198b184ee560f8db05fdb7b6af3c2f0"} Feb 28 09:45:03 crc kubenswrapper[4687]: I0228 09:45:03.337413 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6b8374e61a285fe0a73b0e7716839b1c198b184ee560f8db05fdb7b6af3c2f0" Feb 28 09:45:03 crc kubenswrapper[4687]: I0228 09:45:03.337171 4687 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537865-wft68" Feb 28 09:45:03 crc kubenswrapper[4687]: I0228 09:45:03.656396 4687 scope.go:117] "RemoveContainer" containerID="483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" Feb 28 09:45:03 crc kubenswrapper[4687]: E0228 09:45:03.657449 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:45:03 crc kubenswrapper[4687]: I0228 09:45:03.680807 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537820-r5c29"] Feb 28 09:45:03 crc kubenswrapper[4687]: I0228 09:45:03.687608 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537820-r5c29"] Feb 28 09:45:04 crc kubenswrapper[4687]: I0228 09:45:04.666383 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e899d87a-f034-4436-8409-ca04178918b7" path="/var/lib/kubelet/pods/e899d87a-f034-4436-8409-ca04178918b7/volumes" Feb 28 09:45:16 crc kubenswrapper[4687]: I0228 09:45:16.656836 4687 scope.go:117] "RemoveContainer" containerID="483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" Feb 28 09:45:16 crc kubenswrapper[4687]: E0228 09:45:16.657710 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:45:28 crc kubenswrapper[4687]: I0228 09:45:28.664774 4687 scope.go:117] "RemoveContainer" containerID="483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" Feb 28 09:45:29 crc kubenswrapper[4687]: I0228 09:45:29.580054 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerStarted","Data":"193f4e131507074613a20b8d12c9de80ed9e99fe06c33cfd5df2585fad845b32"} Feb 28 09:45:43 crc kubenswrapper[4687]: I0228 09:45:43.522214 4687 scope.go:117] "RemoveContainer" containerID="a60b310cdec6733ac239fb8523ff17cee642452cef4858be568fedb8075066e1" Feb 28 09:46:00 crc kubenswrapper[4687]: I0228 09:46:00.138594 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537866-fpbzs"] Feb 28 09:46:00 crc kubenswrapper[4687]: E0228 09:46:00.140788 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca7e9ee9-5b1f-4e41-a0c8-6581557076bd" containerName="collect-profiles" Feb 28 09:46:00 crc kubenswrapper[4687]: I0228 09:46:00.140809 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca7e9ee9-5b1f-4e41-a0c8-6581557076bd" containerName="collect-profiles" Feb 28 09:46:00 crc kubenswrapper[4687]: I0228 09:46:00.141003 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca7e9ee9-5b1f-4e41-a0c8-6581557076bd" containerName="collect-profiles" Feb 28 09:46:00 crc kubenswrapper[4687]: I0228 09:46:00.141820 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537866-fpbzs" Feb 28 09:46:00 crc kubenswrapper[4687]: I0228 09:46:00.143367 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:46:00 crc kubenswrapper[4687]: I0228 09:46:00.143702 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:46:00 crc kubenswrapper[4687]: I0228 09:46:00.145346 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537866-fpbzs"] Feb 28 09:46:00 crc kubenswrapper[4687]: I0228 09:46:00.149620 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:46:00 crc kubenswrapper[4687]: I0228 09:46:00.324233 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz5fg\" (UniqueName: \"kubernetes.io/projected/ecbba413-398b-4e6d-9f27-d7c3ce6bed48-kube-api-access-lz5fg\") pod \"auto-csr-approver-29537866-fpbzs\" (UID: \"ecbba413-398b-4e6d-9f27-d7c3ce6bed48\") " pod="openshift-infra/auto-csr-approver-29537866-fpbzs" Feb 28 09:46:00 crc kubenswrapper[4687]: I0228 09:46:00.427634 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz5fg\" (UniqueName: \"kubernetes.io/projected/ecbba413-398b-4e6d-9f27-d7c3ce6bed48-kube-api-access-lz5fg\") pod \"auto-csr-approver-29537866-fpbzs\" (UID: \"ecbba413-398b-4e6d-9f27-d7c3ce6bed48\") " pod="openshift-infra/auto-csr-approver-29537866-fpbzs" Feb 28 09:46:00 crc kubenswrapper[4687]: I0228 09:46:00.447156 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz5fg\" (UniqueName: \"kubernetes.io/projected/ecbba413-398b-4e6d-9f27-d7c3ce6bed48-kube-api-access-lz5fg\") pod \"auto-csr-approver-29537866-fpbzs\" (UID: \"ecbba413-398b-4e6d-9f27-d7c3ce6bed48\") " 
pod="openshift-infra/auto-csr-approver-29537866-fpbzs" Feb 28 09:46:00 crc kubenswrapper[4687]: I0228 09:46:00.468335 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537866-fpbzs" Feb 28 09:46:00 crc kubenswrapper[4687]: I0228 09:46:00.856672 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537866-fpbzs"] Feb 28 09:46:01 crc kubenswrapper[4687]: I0228 09:46:01.860061 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537866-fpbzs" event={"ID":"ecbba413-398b-4e6d-9f27-d7c3ce6bed48","Type":"ContainerStarted","Data":"520552e965c0433d1a9c64d84c8116f7edfae906c2cb07952dba9ef6cf2ca780"} Feb 28 09:46:02 crc kubenswrapper[4687]: I0228 09:46:02.871140 4687 generic.go:334] "Generic (PLEG): container finished" podID="ecbba413-398b-4e6d-9f27-d7c3ce6bed48" containerID="ab8d80b72b56294558989f4e080d7f54740dd00a34a2db26c58f59ea249314ea" exitCode=0 Feb 28 09:46:02 crc kubenswrapper[4687]: I0228 09:46:02.871260 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537866-fpbzs" event={"ID":"ecbba413-398b-4e6d-9f27-d7c3ce6bed48","Type":"ContainerDied","Data":"ab8d80b72b56294558989f4e080d7f54740dd00a34a2db26c58f59ea249314ea"} Feb 28 09:46:04 crc kubenswrapper[4687]: I0228 09:46:04.216091 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537866-fpbzs" Feb 28 09:46:04 crc kubenswrapper[4687]: I0228 09:46:04.401747 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz5fg\" (UniqueName: \"kubernetes.io/projected/ecbba413-398b-4e6d-9f27-d7c3ce6bed48-kube-api-access-lz5fg\") pod \"ecbba413-398b-4e6d-9f27-d7c3ce6bed48\" (UID: \"ecbba413-398b-4e6d-9f27-d7c3ce6bed48\") " Feb 28 09:46:04 crc kubenswrapper[4687]: I0228 09:46:04.408058 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecbba413-398b-4e6d-9f27-d7c3ce6bed48-kube-api-access-lz5fg" (OuterVolumeSpecName: "kube-api-access-lz5fg") pod "ecbba413-398b-4e6d-9f27-d7c3ce6bed48" (UID: "ecbba413-398b-4e6d-9f27-d7c3ce6bed48"). InnerVolumeSpecName "kube-api-access-lz5fg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:46:04 crc kubenswrapper[4687]: I0228 09:46:04.504288 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz5fg\" (UniqueName: \"kubernetes.io/projected/ecbba413-398b-4e6d-9f27-d7c3ce6bed48-kube-api-access-lz5fg\") on node \"crc\" DevicePath \"\"" Feb 28 09:46:04 crc kubenswrapper[4687]: I0228 09:46:04.888690 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537866-fpbzs" event={"ID":"ecbba413-398b-4e6d-9f27-d7c3ce6bed48","Type":"ContainerDied","Data":"520552e965c0433d1a9c64d84c8116f7edfae906c2cb07952dba9ef6cf2ca780"} Feb 28 09:46:04 crc kubenswrapper[4687]: I0228 09:46:04.888751 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="520552e965c0433d1a9c64d84c8116f7edfae906c2cb07952dba9ef6cf2ca780" Feb 28 09:46:04 crc kubenswrapper[4687]: I0228 09:46:04.888834 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537866-fpbzs" Feb 28 09:46:05 crc kubenswrapper[4687]: I0228 09:46:05.280202 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537860-slrkh"] Feb 28 09:46:05 crc kubenswrapper[4687]: I0228 09:46:05.286847 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537860-slrkh"] Feb 28 09:46:05 crc kubenswrapper[4687]: I0228 09:46:05.610001 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jgl5w"] Feb 28 09:46:05 crc kubenswrapper[4687]: E0228 09:46:05.610686 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecbba413-398b-4e6d-9f27-d7c3ce6bed48" containerName="oc" Feb 28 09:46:05 crc kubenswrapper[4687]: I0228 09:46:05.610778 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecbba413-398b-4e6d-9f27-d7c3ce6bed48" containerName="oc" Feb 28 09:46:05 crc kubenswrapper[4687]: I0228 09:46:05.611143 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecbba413-398b-4e6d-9f27-d7c3ce6bed48" containerName="oc" Feb 28 09:46:05 crc kubenswrapper[4687]: I0228 09:46:05.614825 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jgl5w" Feb 28 09:46:05 crc kubenswrapper[4687]: I0228 09:46:05.621445 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jgl5w"] Feb 28 09:46:05 crc kubenswrapper[4687]: I0228 09:46:05.637664 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc5581d1-6dfc-47d7-8918-c3112e5a79ad-catalog-content\") pod \"redhat-marketplace-jgl5w\" (UID: \"bc5581d1-6dfc-47d7-8918-c3112e5a79ad\") " pod="openshift-marketplace/redhat-marketplace-jgl5w" Feb 28 09:46:05 crc kubenswrapper[4687]: I0228 09:46:05.637790 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d654l\" (UniqueName: \"kubernetes.io/projected/bc5581d1-6dfc-47d7-8918-c3112e5a79ad-kube-api-access-d654l\") pod \"redhat-marketplace-jgl5w\" (UID: \"bc5581d1-6dfc-47d7-8918-c3112e5a79ad\") " pod="openshift-marketplace/redhat-marketplace-jgl5w" Feb 28 09:46:05 crc kubenswrapper[4687]: I0228 09:46:05.637905 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc5581d1-6dfc-47d7-8918-c3112e5a79ad-utilities\") pod \"redhat-marketplace-jgl5w\" (UID: \"bc5581d1-6dfc-47d7-8918-c3112e5a79ad\") " pod="openshift-marketplace/redhat-marketplace-jgl5w" Feb 28 09:46:05 crc kubenswrapper[4687]: I0228 09:46:05.740176 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc5581d1-6dfc-47d7-8918-c3112e5a79ad-utilities\") pod \"redhat-marketplace-jgl5w\" (UID: \"bc5581d1-6dfc-47d7-8918-c3112e5a79ad\") " pod="openshift-marketplace/redhat-marketplace-jgl5w" Feb 28 09:46:05 crc kubenswrapper[4687]: I0228 09:46:05.740280 4687 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc5581d1-6dfc-47d7-8918-c3112e5a79ad-catalog-content\") pod \"redhat-marketplace-jgl5w\" (UID: \"bc5581d1-6dfc-47d7-8918-c3112e5a79ad\") " pod="openshift-marketplace/redhat-marketplace-jgl5w" Feb 28 09:46:05 crc kubenswrapper[4687]: I0228 09:46:05.740349 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d654l\" (UniqueName: \"kubernetes.io/projected/bc5581d1-6dfc-47d7-8918-c3112e5a79ad-kube-api-access-d654l\") pod \"redhat-marketplace-jgl5w\" (UID: \"bc5581d1-6dfc-47d7-8918-c3112e5a79ad\") " pod="openshift-marketplace/redhat-marketplace-jgl5w" Feb 28 09:46:05 crc kubenswrapper[4687]: I0228 09:46:05.740757 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc5581d1-6dfc-47d7-8918-c3112e5a79ad-utilities\") pod \"redhat-marketplace-jgl5w\" (UID: \"bc5581d1-6dfc-47d7-8918-c3112e5a79ad\") " pod="openshift-marketplace/redhat-marketplace-jgl5w" Feb 28 09:46:05 crc kubenswrapper[4687]: I0228 09:46:05.740757 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc5581d1-6dfc-47d7-8918-c3112e5a79ad-catalog-content\") pod \"redhat-marketplace-jgl5w\" (UID: \"bc5581d1-6dfc-47d7-8918-c3112e5a79ad\") " pod="openshift-marketplace/redhat-marketplace-jgl5w" Feb 28 09:46:05 crc kubenswrapper[4687]: I0228 09:46:05.757803 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d654l\" (UniqueName: \"kubernetes.io/projected/bc5581d1-6dfc-47d7-8918-c3112e5a79ad-kube-api-access-d654l\") pod \"redhat-marketplace-jgl5w\" (UID: \"bc5581d1-6dfc-47d7-8918-c3112e5a79ad\") " pod="openshift-marketplace/redhat-marketplace-jgl5w" Feb 28 09:46:05 crc kubenswrapper[4687]: I0228 09:46:05.938277 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jgl5w" Feb 28 09:46:06 crc kubenswrapper[4687]: I0228 09:46:06.407971 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jgl5w"] Feb 28 09:46:06 crc kubenswrapper[4687]: I0228 09:46:06.667770 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e8df979-76aa-443b-a961-7a0f1252e386" path="/var/lib/kubelet/pods/4e8df979-76aa-443b-a961-7a0f1252e386/volumes" Feb 28 09:46:06 crc kubenswrapper[4687]: I0228 09:46:06.909270 4687 generic.go:334] "Generic (PLEG): container finished" podID="bc5581d1-6dfc-47d7-8918-c3112e5a79ad" containerID="3260a6552f58c8bde5a4002f1885d04752d4bb92364544938b4ba1013393b893" exitCode=0 Feb 28 09:46:06 crc kubenswrapper[4687]: I0228 09:46:06.909329 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jgl5w" event={"ID":"bc5581d1-6dfc-47d7-8918-c3112e5a79ad","Type":"ContainerDied","Data":"3260a6552f58c8bde5a4002f1885d04752d4bb92364544938b4ba1013393b893"} Feb 28 09:46:06 crc kubenswrapper[4687]: I0228 09:46:06.909363 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jgl5w" event={"ID":"bc5581d1-6dfc-47d7-8918-c3112e5a79ad","Type":"ContainerStarted","Data":"4e54d657c28669d497df382a017680ff453e8c6c1a9879f128291e89149341ce"} Feb 28 09:46:07 crc kubenswrapper[4687]: I0228 09:46:07.920669 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jgl5w" event={"ID":"bc5581d1-6dfc-47d7-8918-c3112e5a79ad","Type":"ContainerStarted","Data":"0a1fd03653c572cce22a6940fda3557901f7735e650b9fc42ff5e528979d7d22"} Feb 28 09:46:08 crc kubenswrapper[4687]: I0228 09:46:08.933404 4687 generic.go:334] "Generic (PLEG): container finished" podID="bc5581d1-6dfc-47d7-8918-c3112e5a79ad" containerID="0a1fd03653c572cce22a6940fda3557901f7735e650b9fc42ff5e528979d7d22" exitCode=0 Feb 28 09:46:08 crc 
kubenswrapper[4687]: I0228 09:46:08.933486 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jgl5w" event={"ID":"bc5581d1-6dfc-47d7-8918-c3112e5a79ad","Type":"ContainerDied","Data":"0a1fd03653c572cce22a6940fda3557901f7735e650b9fc42ff5e528979d7d22"} Feb 28 09:46:09 crc kubenswrapper[4687]: I0228 09:46:09.942423 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jgl5w" event={"ID":"bc5581d1-6dfc-47d7-8918-c3112e5a79ad","Type":"ContainerStarted","Data":"3a2f40402ed8f3f77af910a46451e5d90e49238dd908002ef824bba6762f2157"} Feb 28 09:46:09 crc kubenswrapper[4687]: I0228 09:46:09.986503 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jgl5w" podStartSLOduration=2.4036875220000002 podStartE2EDuration="4.986479965s" podCreationTimestamp="2026-02-28 09:46:05 +0000 UTC" firstStartedPulling="2026-02-28 09:46:06.912350141 +0000 UTC m=+2558.602919478" lastFinishedPulling="2026-02-28 09:46:09.495142584 +0000 UTC m=+2561.185711921" observedRunningTime="2026-02-28 09:46:09.971788057 +0000 UTC m=+2561.662357414" watchObservedRunningTime="2026-02-28 09:46:09.986479965 +0000 UTC m=+2561.677049302" Feb 28 09:46:15 crc kubenswrapper[4687]: I0228 09:46:15.938779 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jgl5w" Feb 28 09:46:15 crc kubenswrapper[4687]: I0228 09:46:15.939382 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jgl5w" Feb 28 09:46:15 crc kubenswrapper[4687]: I0228 09:46:15.977898 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jgl5w" Feb 28 09:46:16 crc kubenswrapper[4687]: I0228 09:46:16.032337 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-jgl5w" Feb 28 09:46:17 crc kubenswrapper[4687]: I0228 09:46:17.997429 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jgl5w"] Feb 28 09:46:18 crc kubenswrapper[4687]: I0228 09:46:18.007439 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jgl5w" podUID="bc5581d1-6dfc-47d7-8918-c3112e5a79ad" containerName="registry-server" containerID="cri-o://3a2f40402ed8f3f77af910a46451e5d90e49238dd908002ef824bba6762f2157" gracePeriod=2 Feb 28 09:46:18 crc kubenswrapper[4687]: I0228 09:46:18.443627 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jgl5w" Feb 28 09:46:18 crc kubenswrapper[4687]: I0228 09:46:18.587289 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc5581d1-6dfc-47d7-8918-c3112e5a79ad-catalog-content\") pod \"bc5581d1-6dfc-47d7-8918-c3112e5a79ad\" (UID: \"bc5581d1-6dfc-47d7-8918-c3112e5a79ad\") " Feb 28 09:46:18 crc kubenswrapper[4687]: I0228 09:46:18.587931 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d654l\" (UniqueName: \"kubernetes.io/projected/bc5581d1-6dfc-47d7-8918-c3112e5a79ad-kube-api-access-d654l\") pod \"bc5581d1-6dfc-47d7-8918-c3112e5a79ad\" (UID: \"bc5581d1-6dfc-47d7-8918-c3112e5a79ad\") " Feb 28 09:46:18 crc kubenswrapper[4687]: I0228 09:46:18.587992 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc5581d1-6dfc-47d7-8918-c3112e5a79ad-utilities\") pod \"bc5581d1-6dfc-47d7-8918-c3112e5a79ad\" (UID: \"bc5581d1-6dfc-47d7-8918-c3112e5a79ad\") " Feb 28 09:46:18 crc kubenswrapper[4687]: I0228 09:46:18.588604 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/bc5581d1-6dfc-47d7-8918-c3112e5a79ad-utilities" (OuterVolumeSpecName: "utilities") pod "bc5581d1-6dfc-47d7-8918-c3112e5a79ad" (UID: "bc5581d1-6dfc-47d7-8918-c3112e5a79ad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:46:18 crc kubenswrapper[4687]: I0228 09:46:18.597166 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5581d1-6dfc-47d7-8918-c3112e5a79ad-kube-api-access-d654l" (OuterVolumeSpecName: "kube-api-access-d654l") pod "bc5581d1-6dfc-47d7-8918-c3112e5a79ad" (UID: "bc5581d1-6dfc-47d7-8918-c3112e5a79ad"). InnerVolumeSpecName "kube-api-access-d654l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:46:18 crc kubenswrapper[4687]: I0228 09:46:18.610036 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5581d1-6dfc-47d7-8918-c3112e5a79ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc5581d1-6dfc-47d7-8918-c3112e5a79ad" (UID: "bc5581d1-6dfc-47d7-8918-c3112e5a79ad"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:46:18 crc kubenswrapper[4687]: I0228 09:46:18.689999 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc5581d1-6dfc-47d7-8918-c3112e5a79ad-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:46:18 crc kubenswrapper[4687]: I0228 09:46:18.690048 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d654l\" (UniqueName: \"kubernetes.io/projected/bc5581d1-6dfc-47d7-8918-c3112e5a79ad-kube-api-access-d654l\") on node \"crc\" DevicePath \"\"" Feb 28 09:46:18 crc kubenswrapper[4687]: I0228 09:46:18.690061 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc5581d1-6dfc-47d7-8918-c3112e5a79ad-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:46:19 crc kubenswrapper[4687]: I0228 09:46:19.018760 4687 generic.go:334] "Generic (PLEG): container finished" podID="bc5581d1-6dfc-47d7-8918-c3112e5a79ad" containerID="3a2f40402ed8f3f77af910a46451e5d90e49238dd908002ef824bba6762f2157" exitCode=0 Feb 28 09:46:19 crc kubenswrapper[4687]: I0228 09:46:19.018825 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jgl5w" event={"ID":"bc5581d1-6dfc-47d7-8918-c3112e5a79ad","Type":"ContainerDied","Data":"3a2f40402ed8f3f77af910a46451e5d90e49238dd908002ef824bba6762f2157"} Feb 28 09:46:19 crc kubenswrapper[4687]: I0228 09:46:19.018859 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jgl5w" event={"ID":"bc5581d1-6dfc-47d7-8918-c3112e5a79ad","Type":"ContainerDied","Data":"4e54d657c28669d497df382a017680ff453e8c6c1a9879f128291e89149341ce"} Feb 28 09:46:19 crc kubenswrapper[4687]: I0228 09:46:19.018891 4687 scope.go:117] "RemoveContainer" containerID="3a2f40402ed8f3f77af910a46451e5d90e49238dd908002ef824bba6762f2157" Feb 28 09:46:19 crc kubenswrapper[4687]: I0228 
09:46:19.019077 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jgl5w" Feb 28 09:46:19 crc kubenswrapper[4687]: I0228 09:46:19.038301 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jgl5w"] Feb 28 09:46:19 crc kubenswrapper[4687]: I0228 09:46:19.039476 4687 scope.go:117] "RemoveContainer" containerID="0a1fd03653c572cce22a6940fda3557901f7735e650b9fc42ff5e528979d7d22" Feb 28 09:46:19 crc kubenswrapper[4687]: I0228 09:46:19.046707 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jgl5w"] Feb 28 09:46:19 crc kubenswrapper[4687]: I0228 09:46:19.059357 4687 scope.go:117] "RemoveContainer" containerID="3260a6552f58c8bde5a4002f1885d04752d4bb92364544938b4ba1013393b893" Feb 28 09:46:19 crc kubenswrapper[4687]: I0228 09:46:19.107858 4687 scope.go:117] "RemoveContainer" containerID="3a2f40402ed8f3f77af910a46451e5d90e49238dd908002ef824bba6762f2157" Feb 28 09:46:19 crc kubenswrapper[4687]: E0228 09:46:19.108768 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a2f40402ed8f3f77af910a46451e5d90e49238dd908002ef824bba6762f2157\": container with ID starting with 3a2f40402ed8f3f77af910a46451e5d90e49238dd908002ef824bba6762f2157 not found: ID does not exist" containerID="3a2f40402ed8f3f77af910a46451e5d90e49238dd908002ef824bba6762f2157" Feb 28 09:46:19 crc kubenswrapper[4687]: I0228 09:46:19.108814 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a2f40402ed8f3f77af910a46451e5d90e49238dd908002ef824bba6762f2157"} err="failed to get container status \"3a2f40402ed8f3f77af910a46451e5d90e49238dd908002ef824bba6762f2157\": rpc error: code = NotFound desc = could not find container \"3a2f40402ed8f3f77af910a46451e5d90e49238dd908002ef824bba6762f2157\": container with ID starting with 
3a2f40402ed8f3f77af910a46451e5d90e49238dd908002ef824bba6762f2157 not found: ID does not exist" Feb 28 09:46:19 crc kubenswrapper[4687]: I0228 09:46:19.108843 4687 scope.go:117] "RemoveContainer" containerID="0a1fd03653c572cce22a6940fda3557901f7735e650b9fc42ff5e528979d7d22" Feb 28 09:46:19 crc kubenswrapper[4687]: E0228 09:46:19.109713 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a1fd03653c572cce22a6940fda3557901f7735e650b9fc42ff5e528979d7d22\": container with ID starting with 0a1fd03653c572cce22a6940fda3557901f7735e650b9fc42ff5e528979d7d22 not found: ID does not exist" containerID="0a1fd03653c572cce22a6940fda3557901f7735e650b9fc42ff5e528979d7d22" Feb 28 09:46:19 crc kubenswrapper[4687]: I0228 09:46:19.109775 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a1fd03653c572cce22a6940fda3557901f7735e650b9fc42ff5e528979d7d22"} err="failed to get container status \"0a1fd03653c572cce22a6940fda3557901f7735e650b9fc42ff5e528979d7d22\": rpc error: code = NotFound desc = could not find container \"0a1fd03653c572cce22a6940fda3557901f7735e650b9fc42ff5e528979d7d22\": container with ID starting with 0a1fd03653c572cce22a6940fda3557901f7735e650b9fc42ff5e528979d7d22 not found: ID does not exist" Feb 28 09:46:19 crc kubenswrapper[4687]: I0228 09:46:19.109833 4687 scope.go:117] "RemoveContainer" containerID="3260a6552f58c8bde5a4002f1885d04752d4bb92364544938b4ba1013393b893" Feb 28 09:46:19 crc kubenswrapper[4687]: E0228 09:46:19.110518 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3260a6552f58c8bde5a4002f1885d04752d4bb92364544938b4ba1013393b893\": container with ID starting with 3260a6552f58c8bde5a4002f1885d04752d4bb92364544938b4ba1013393b893 not found: ID does not exist" containerID="3260a6552f58c8bde5a4002f1885d04752d4bb92364544938b4ba1013393b893" Feb 28 09:46:19 crc 
kubenswrapper[4687]: I0228 09:46:19.110567 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3260a6552f58c8bde5a4002f1885d04752d4bb92364544938b4ba1013393b893"} err="failed to get container status \"3260a6552f58c8bde5a4002f1885d04752d4bb92364544938b4ba1013393b893\": rpc error: code = NotFound desc = could not find container \"3260a6552f58c8bde5a4002f1885d04752d4bb92364544938b4ba1013393b893\": container with ID starting with 3260a6552f58c8bde5a4002f1885d04752d4bb92364544938b4ba1013393b893 not found: ID does not exist" Feb 28 09:46:20 crc kubenswrapper[4687]: I0228 09:46:20.667576 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5581d1-6dfc-47d7-8918-c3112e5a79ad" path="/var/lib/kubelet/pods/bc5581d1-6dfc-47d7-8918-c3112e5a79ad/volumes" Feb 28 09:46:43 crc kubenswrapper[4687]: I0228 09:46:43.587169 4687 scope.go:117] "RemoveContainer" containerID="1c13cdb34ab669145698e419324ec5c2e3bd745e94064c63d5c4c651f0515c25" Feb 28 09:47:35 crc kubenswrapper[4687]: I0228 09:47:35.848681 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gl7sn"] Feb 28 09:47:35 crc kubenswrapper[4687]: E0228 09:47:35.849576 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc5581d1-6dfc-47d7-8918-c3112e5a79ad" containerName="extract-utilities" Feb 28 09:47:35 crc kubenswrapper[4687]: I0228 09:47:35.849589 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5581d1-6dfc-47d7-8918-c3112e5a79ad" containerName="extract-utilities" Feb 28 09:47:35 crc kubenswrapper[4687]: E0228 09:47:35.849611 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc5581d1-6dfc-47d7-8918-c3112e5a79ad" containerName="registry-server" Feb 28 09:47:35 crc kubenswrapper[4687]: I0228 09:47:35.849617 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5581d1-6dfc-47d7-8918-c3112e5a79ad" containerName="registry-server" Feb 28 09:47:35 crc 
kubenswrapper[4687]: E0228 09:47:35.849641 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc5581d1-6dfc-47d7-8918-c3112e5a79ad" containerName="extract-content" Feb 28 09:47:35 crc kubenswrapper[4687]: I0228 09:47:35.849648 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5581d1-6dfc-47d7-8918-c3112e5a79ad" containerName="extract-content" Feb 28 09:47:35 crc kubenswrapper[4687]: I0228 09:47:35.849809 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc5581d1-6dfc-47d7-8918-c3112e5a79ad" containerName="registry-server" Feb 28 09:47:35 crc kubenswrapper[4687]: I0228 09:47:35.850904 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gl7sn" Feb 28 09:47:35 crc kubenswrapper[4687]: I0228 09:47:35.867675 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gl7sn"] Feb 28 09:47:36 crc kubenswrapper[4687]: I0228 09:47:36.008208 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc313272-b852-4a9f-b844-3594d9da1900-utilities\") pod \"certified-operators-gl7sn\" (UID: \"fc313272-b852-4a9f-b844-3594d9da1900\") " pod="openshift-marketplace/certified-operators-gl7sn" Feb 28 09:47:36 crc kubenswrapper[4687]: I0228 09:47:36.008356 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc313272-b852-4a9f-b844-3594d9da1900-catalog-content\") pod \"certified-operators-gl7sn\" (UID: \"fc313272-b852-4a9f-b844-3594d9da1900\") " pod="openshift-marketplace/certified-operators-gl7sn" Feb 28 09:47:36 crc kubenswrapper[4687]: I0228 09:47:36.008414 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsn6x\" (UniqueName: 
\"kubernetes.io/projected/fc313272-b852-4a9f-b844-3594d9da1900-kube-api-access-xsn6x\") pod \"certified-operators-gl7sn\" (UID: \"fc313272-b852-4a9f-b844-3594d9da1900\") " pod="openshift-marketplace/certified-operators-gl7sn" Feb 28 09:47:36 crc kubenswrapper[4687]: I0228 09:47:36.111188 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsn6x\" (UniqueName: \"kubernetes.io/projected/fc313272-b852-4a9f-b844-3594d9da1900-kube-api-access-xsn6x\") pod \"certified-operators-gl7sn\" (UID: \"fc313272-b852-4a9f-b844-3594d9da1900\") " pod="openshift-marketplace/certified-operators-gl7sn" Feb 28 09:47:36 crc kubenswrapper[4687]: I0228 09:47:36.111318 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc313272-b852-4a9f-b844-3594d9da1900-utilities\") pod \"certified-operators-gl7sn\" (UID: \"fc313272-b852-4a9f-b844-3594d9da1900\") " pod="openshift-marketplace/certified-operators-gl7sn" Feb 28 09:47:36 crc kubenswrapper[4687]: I0228 09:47:36.111376 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc313272-b852-4a9f-b844-3594d9da1900-catalog-content\") pod \"certified-operators-gl7sn\" (UID: \"fc313272-b852-4a9f-b844-3594d9da1900\") " pod="openshift-marketplace/certified-operators-gl7sn" Feb 28 09:47:36 crc kubenswrapper[4687]: I0228 09:47:36.111885 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc313272-b852-4a9f-b844-3594d9da1900-utilities\") pod \"certified-operators-gl7sn\" (UID: \"fc313272-b852-4a9f-b844-3594d9da1900\") " pod="openshift-marketplace/certified-operators-gl7sn" Feb 28 09:47:36 crc kubenswrapper[4687]: I0228 09:47:36.111922 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/fc313272-b852-4a9f-b844-3594d9da1900-catalog-content\") pod \"certified-operators-gl7sn\" (UID: \"fc313272-b852-4a9f-b844-3594d9da1900\") " pod="openshift-marketplace/certified-operators-gl7sn" Feb 28 09:47:36 crc kubenswrapper[4687]: I0228 09:47:36.133029 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsn6x\" (UniqueName: \"kubernetes.io/projected/fc313272-b852-4a9f-b844-3594d9da1900-kube-api-access-xsn6x\") pod \"certified-operators-gl7sn\" (UID: \"fc313272-b852-4a9f-b844-3594d9da1900\") " pod="openshift-marketplace/certified-operators-gl7sn" Feb 28 09:47:36 crc kubenswrapper[4687]: I0228 09:47:36.166281 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gl7sn" Feb 28 09:47:36 crc kubenswrapper[4687]: I0228 09:47:36.629085 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gl7sn"] Feb 28 09:47:36 crc kubenswrapper[4687]: I0228 09:47:36.712554 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gl7sn" event={"ID":"fc313272-b852-4a9f-b844-3594d9da1900","Type":"ContainerStarted","Data":"aa793cd6cab3134cdb578501a752872277603a2a1c32c2b6af57e654a8c1550c"} Feb 28 09:47:37 crc kubenswrapper[4687]: I0228 09:47:37.726432 4687 generic.go:334] "Generic (PLEG): container finished" podID="fc313272-b852-4a9f-b844-3594d9da1900" containerID="ffd460aad5db54eff9e1a91fa4820c8328beb141345e86d37635711d5f7e725b" exitCode=0 Feb 28 09:47:37 crc kubenswrapper[4687]: I0228 09:47:37.726549 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gl7sn" event={"ID":"fc313272-b852-4a9f-b844-3594d9da1900","Type":"ContainerDied","Data":"ffd460aad5db54eff9e1a91fa4820c8328beb141345e86d37635711d5f7e725b"} Feb 28 09:47:38 crc kubenswrapper[4687]: I0228 09:47:38.736264 4687 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-gl7sn" event={"ID":"fc313272-b852-4a9f-b844-3594d9da1900","Type":"ContainerStarted","Data":"e88fdbfe584161bc6c85f1e2e34c7142002e82b9f6f2c02f2b9255ff2957be00"} Feb 28 09:47:39 crc kubenswrapper[4687]: I0228 09:47:39.747957 4687 generic.go:334] "Generic (PLEG): container finished" podID="fc313272-b852-4a9f-b844-3594d9da1900" containerID="e88fdbfe584161bc6c85f1e2e34c7142002e82b9f6f2c02f2b9255ff2957be00" exitCode=0 Feb 28 09:47:39 crc kubenswrapper[4687]: I0228 09:47:39.748055 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gl7sn" event={"ID":"fc313272-b852-4a9f-b844-3594d9da1900","Type":"ContainerDied","Data":"e88fdbfe584161bc6c85f1e2e34c7142002e82b9f6f2c02f2b9255ff2957be00"} Feb 28 09:47:40 crc kubenswrapper[4687]: I0228 09:47:40.762189 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gl7sn" event={"ID":"fc313272-b852-4a9f-b844-3594d9da1900","Type":"ContainerStarted","Data":"b7c993bdb3b1c82b066a2df4d7f5a4a85ce904ac6e480009f3a6a479620dfec1"} Feb 28 09:47:40 crc kubenswrapper[4687]: I0228 09:47:40.788398 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gl7sn" podStartSLOduration=3.300808069 podStartE2EDuration="5.788382154s" podCreationTimestamp="2026-02-28 09:47:35 +0000 UTC" firstStartedPulling="2026-02-28 09:47:37.728891392 +0000 UTC m=+2649.419460730" lastFinishedPulling="2026-02-28 09:47:40.216465477 +0000 UTC m=+2651.907034815" observedRunningTime="2026-02-28 09:47:40.782133679 +0000 UTC m=+2652.472703016" watchObservedRunningTime="2026-02-28 09:47:40.788382154 +0000 UTC m=+2652.478951481" Feb 28 09:47:46 crc kubenswrapper[4687]: I0228 09:47:46.166656 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gl7sn" Feb 28 09:47:46 crc kubenswrapper[4687]: I0228 
09:47:46.167461 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gl7sn" Feb 28 09:47:46 crc kubenswrapper[4687]: I0228 09:47:46.209943 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gl7sn" Feb 28 09:47:46 crc kubenswrapper[4687]: I0228 09:47:46.855569 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gl7sn" Feb 28 09:47:46 crc kubenswrapper[4687]: I0228 09:47:46.894850 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gl7sn"] Feb 28 09:47:48 crc kubenswrapper[4687]: I0228 09:47:48.840048 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gl7sn" podUID="fc313272-b852-4a9f-b844-3594d9da1900" containerName="registry-server" containerID="cri-o://b7c993bdb3b1c82b066a2df4d7f5a4a85ce904ac6e480009f3a6a479620dfec1" gracePeriod=2 Feb 28 09:47:49 crc kubenswrapper[4687]: I0228 09:47:49.300844 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gl7sn" Feb 28 09:47:49 crc kubenswrapper[4687]: I0228 09:47:49.484984 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc313272-b852-4a9f-b844-3594d9da1900-catalog-content\") pod \"fc313272-b852-4a9f-b844-3594d9da1900\" (UID: \"fc313272-b852-4a9f-b844-3594d9da1900\") " Feb 28 09:47:49 crc kubenswrapper[4687]: I0228 09:47:49.485137 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsn6x\" (UniqueName: \"kubernetes.io/projected/fc313272-b852-4a9f-b844-3594d9da1900-kube-api-access-xsn6x\") pod \"fc313272-b852-4a9f-b844-3594d9da1900\" (UID: \"fc313272-b852-4a9f-b844-3594d9da1900\") " Feb 28 09:47:49 crc kubenswrapper[4687]: I0228 09:47:49.485376 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc313272-b852-4a9f-b844-3594d9da1900-utilities\") pod \"fc313272-b852-4a9f-b844-3594d9da1900\" (UID: \"fc313272-b852-4a9f-b844-3594d9da1900\") " Feb 28 09:47:49 crc kubenswrapper[4687]: I0228 09:47:49.485849 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc313272-b852-4a9f-b844-3594d9da1900-utilities" (OuterVolumeSpecName: "utilities") pod "fc313272-b852-4a9f-b844-3594d9da1900" (UID: "fc313272-b852-4a9f-b844-3594d9da1900"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:47:49 crc kubenswrapper[4687]: I0228 09:47:49.491241 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc313272-b852-4a9f-b844-3594d9da1900-kube-api-access-xsn6x" (OuterVolumeSpecName: "kube-api-access-xsn6x") pod "fc313272-b852-4a9f-b844-3594d9da1900" (UID: "fc313272-b852-4a9f-b844-3594d9da1900"). InnerVolumeSpecName "kube-api-access-xsn6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:47:49 crc kubenswrapper[4687]: I0228 09:47:49.528870 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc313272-b852-4a9f-b844-3594d9da1900-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc313272-b852-4a9f-b844-3594d9da1900" (UID: "fc313272-b852-4a9f-b844-3594d9da1900"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:47:49 crc kubenswrapper[4687]: I0228 09:47:49.587554 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc313272-b852-4a9f-b844-3594d9da1900-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:47:49 crc kubenswrapper[4687]: I0228 09:47:49.587588 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsn6x\" (UniqueName: \"kubernetes.io/projected/fc313272-b852-4a9f-b844-3594d9da1900-kube-api-access-xsn6x\") on node \"crc\" DevicePath \"\"" Feb 28 09:47:49 crc kubenswrapper[4687]: I0228 09:47:49.587601 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc313272-b852-4a9f-b844-3594d9da1900-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:47:49 crc kubenswrapper[4687]: I0228 09:47:49.852055 4687 generic.go:334] "Generic (PLEG): container finished" podID="fc313272-b852-4a9f-b844-3594d9da1900" containerID="b7c993bdb3b1c82b066a2df4d7f5a4a85ce904ac6e480009f3a6a479620dfec1" exitCode=0 Feb 28 09:47:49 crc kubenswrapper[4687]: I0228 09:47:49.852105 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gl7sn" event={"ID":"fc313272-b852-4a9f-b844-3594d9da1900","Type":"ContainerDied","Data":"b7c993bdb3b1c82b066a2df4d7f5a4a85ce904ac6e480009f3a6a479620dfec1"} Feb 28 09:47:49 crc kubenswrapper[4687]: I0228 09:47:49.852137 4687 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-gl7sn" event={"ID":"fc313272-b852-4a9f-b844-3594d9da1900","Type":"ContainerDied","Data":"aa793cd6cab3134cdb578501a752872277603a2a1c32c2b6af57e654a8c1550c"} Feb 28 09:47:49 crc kubenswrapper[4687]: I0228 09:47:49.852158 4687 scope.go:117] "RemoveContainer" containerID="b7c993bdb3b1c82b066a2df4d7f5a4a85ce904ac6e480009f3a6a479620dfec1" Feb 28 09:47:49 crc kubenswrapper[4687]: I0228 09:47:49.852176 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gl7sn" Feb 28 09:47:49 crc kubenswrapper[4687]: I0228 09:47:49.881187 4687 scope.go:117] "RemoveContainer" containerID="e88fdbfe584161bc6c85f1e2e34c7142002e82b9f6f2c02f2b9255ff2957be00" Feb 28 09:47:49 crc kubenswrapper[4687]: I0228 09:47:49.887998 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gl7sn"] Feb 28 09:47:49 crc kubenswrapper[4687]: I0228 09:47:49.897712 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gl7sn"] Feb 28 09:47:49 crc kubenswrapper[4687]: I0228 09:47:49.901625 4687 scope.go:117] "RemoveContainer" containerID="ffd460aad5db54eff9e1a91fa4820c8328beb141345e86d37635711d5f7e725b" Feb 28 09:47:49 crc kubenswrapper[4687]: I0228 09:47:49.933512 4687 scope.go:117] "RemoveContainer" containerID="b7c993bdb3b1c82b066a2df4d7f5a4a85ce904ac6e480009f3a6a479620dfec1" Feb 28 09:47:49 crc kubenswrapper[4687]: E0228 09:47:49.933833 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7c993bdb3b1c82b066a2df4d7f5a4a85ce904ac6e480009f3a6a479620dfec1\": container with ID starting with b7c993bdb3b1c82b066a2df4d7f5a4a85ce904ac6e480009f3a6a479620dfec1 not found: ID does not exist" containerID="b7c993bdb3b1c82b066a2df4d7f5a4a85ce904ac6e480009f3a6a479620dfec1" Feb 28 09:47:49 crc kubenswrapper[4687]: I0228 
09:47:49.933870 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7c993bdb3b1c82b066a2df4d7f5a4a85ce904ac6e480009f3a6a479620dfec1"} err="failed to get container status \"b7c993bdb3b1c82b066a2df4d7f5a4a85ce904ac6e480009f3a6a479620dfec1\": rpc error: code = NotFound desc = could not find container \"b7c993bdb3b1c82b066a2df4d7f5a4a85ce904ac6e480009f3a6a479620dfec1\": container with ID starting with b7c993bdb3b1c82b066a2df4d7f5a4a85ce904ac6e480009f3a6a479620dfec1 not found: ID does not exist" Feb 28 09:47:49 crc kubenswrapper[4687]: I0228 09:47:49.933906 4687 scope.go:117] "RemoveContainer" containerID="e88fdbfe584161bc6c85f1e2e34c7142002e82b9f6f2c02f2b9255ff2957be00" Feb 28 09:47:49 crc kubenswrapper[4687]: E0228 09:47:49.934230 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e88fdbfe584161bc6c85f1e2e34c7142002e82b9f6f2c02f2b9255ff2957be00\": container with ID starting with e88fdbfe584161bc6c85f1e2e34c7142002e82b9f6f2c02f2b9255ff2957be00 not found: ID does not exist" containerID="e88fdbfe584161bc6c85f1e2e34c7142002e82b9f6f2c02f2b9255ff2957be00" Feb 28 09:47:49 crc kubenswrapper[4687]: I0228 09:47:49.934277 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e88fdbfe584161bc6c85f1e2e34c7142002e82b9f6f2c02f2b9255ff2957be00"} err="failed to get container status \"e88fdbfe584161bc6c85f1e2e34c7142002e82b9f6f2c02f2b9255ff2957be00\": rpc error: code = NotFound desc = could not find container \"e88fdbfe584161bc6c85f1e2e34c7142002e82b9f6f2c02f2b9255ff2957be00\": container with ID starting with e88fdbfe584161bc6c85f1e2e34c7142002e82b9f6f2c02f2b9255ff2957be00 not found: ID does not exist" Feb 28 09:47:49 crc kubenswrapper[4687]: I0228 09:47:49.934311 4687 scope.go:117] "RemoveContainer" containerID="ffd460aad5db54eff9e1a91fa4820c8328beb141345e86d37635711d5f7e725b" Feb 28 09:47:49 crc 
kubenswrapper[4687]: E0228 09:47:49.934584 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffd460aad5db54eff9e1a91fa4820c8328beb141345e86d37635711d5f7e725b\": container with ID starting with ffd460aad5db54eff9e1a91fa4820c8328beb141345e86d37635711d5f7e725b not found: ID does not exist" containerID="ffd460aad5db54eff9e1a91fa4820c8328beb141345e86d37635711d5f7e725b" Feb 28 09:47:49 crc kubenswrapper[4687]: I0228 09:47:49.934618 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffd460aad5db54eff9e1a91fa4820c8328beb141345e86d37635711d5f7e725b"} err="failed to get container status \"ffd460aad5db54eff9e1a91fa4820c8328beb141345e86d37635711d5f7e725b\": rpc error: code = NotFound desc = could not find container \"ffd460aad5db54eff9e1a91fa4820c8328beb141345e86d37635711d5f7e725b\": container with ID starting with ffd460aad5db54eff9e1a91fa4820c8328beb141345e86d37635711d5f7e725b not found: ID does not exist" Feb 28 09:47:50 crc kubenswrapper[4687]: I0228 09:47:50.668860 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc313272-b852-4a9f-b844-3594d9da1900" path="/var/lib/kubelet/pods/fc313272-b852-4a9f-b844-3594d9da1900/volumes" Feb 28 09:47:55 crc kubenswrapper[4687]: I0228 09:47:55.002452 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:47:55 crc kubenswrapper[4687]: I0228 09:47:55.003243 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 28 09:47:58 crc kubenswrapper[4687]: I0228 09:47:58.250970 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-46xwg"] Feb 28 09:47:58 crc kubenswrapper[4687]: E0228 09:47:58.252638 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc313272-b852-4a9f-b844-3594d9da1900" containerName="extract-utilities" Feb 28 09:47:58 crc kubenswrapper[4687]: I0228 09:47:58.252717 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc313272-b852-4a9f-b844-3594d9da1900" containerName="extract-utilities" Feb 28 09:47:58 crc kubenswrapper[4687]: E0228 09:47:58.252811 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc313272-b852-4a9f-b844-3594d9da1900" containerName="extract-content" Feb 28 09:47:58 crc kubenswrapper[4687]: I0228 09:47:58.252877 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc313272-b852-4a9f-b844-3594d9da1900" containerName="extract-content" Feb 28 09:47:58 crc kubenswrapper[4687]: E0228 09:47:58.252959 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc313272-b852-4a9f-b844-3594d9da1900" containerName="registry-server" Feb 28 09:47:58 crc kubenswrapper[4687]: I0228 09:47:58.253010 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc313272-b852-4a9f-b844-3594d9da1900" containerName="registry-server" Feb 28 09:47:58 crc kubenswrapper[4687]: I0228 09:47:58.253330 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc313272-b852-4a9f-b844-3594d9da1900" containerName="registry-server" Feb 28 09:47:58 crc kubenswrapper[4687]: I0228 09:47:58.255159 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-46xwg" Feb 28 09:47:58 crc kubenswrapper[4687]: I0228 09:47:58.262984 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-46xwg"] Feb 28 09:47:58 crc kubenswrapper[4687]: I0228 09:47:58.390908 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc8pd\" (UniqueName: \"kubernetes.io/projected/c839b8ea-8413-4328-9432-8473e432f1d0-kube-api-access-dc8pd\") pod \"redhat-operators-46xwg\" (UID: \"c839b8ea-8413-4328-9432-8473e432f1d0\") " pod="openshift-marketplace/redhat-operators-46xwg" Feb 28 09:47:58 crc kubenswrapper[4687]: I0228 09:47:58.391002 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c839b8ea-8413-4328-9432-8473e432f1d0-catalog-content\") pod \"redhat-operators-46xwg\" (UID: \"c839b8ea-8413-4328-9432-8473e432f1d0\") " pod="openshift-marketplace/redhat-operators-46xwg" Feb 28 09:47:58 crc kubenswrapper[4687]: I0228 09:47:58.391127 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c839b8ea-8413-4328-9432-8473e432f1d0-utilities\") pod \"redhat-operators-46xwg\" (UID: \"c839b8ea-8413-4328-9432-8473e432f1d0\") " pod="openshift-marketplace/redhat-operators-46xwg" Feb 28 09:47:58 crc kubenswrapper[4687]: I0228 09:47:58.493677 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c839b8ea-8413-4328-9432-8473e432f1d0-utilities\") pod \"redhat-operators-46xwg\" (UID: \"c839b8ea-8413-4328-9432-8473e432f1d0\") " pod="openshift-marketplace/redhat-operators-46xwg" Feb 28 09:47:58 crc kubenswrapper[4687]: I0228 09:47:58.494067 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dc8pd\" (UniqueName: \"kubernetes.io/projected/c839b8ea-8413-4328-9432-8473e432f1d0-kube-api-access-dc8pd\") pod \"redhat-operators-46xwg\" (UID: \"c839b8ea-8413-4328-9432-8473e432f1d0\") " pod="openshift-marketplace/redhat-operators-46xwg" Feb 28 09:47:58 crc kubenswrapper[4687]: I0228 09:47:58.494160 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c839b8ea-8413-4328-9432-8473e432f1d0-catalog-content\") pod \"redhat-operators-46xwg\" (UID: \"c839b8ea-8413-4328-9432-8473e432f1d0\") " pod="openshift-marketplace/redhat-operators-46xwg" Feb 28 09:47:58 crc kubenswrapper[4687]: I0228 09:47:58.495548 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c839b8ea-8413-4328-9432-8473e432f1d0-utilities\") pod \"redhat-operators-46xwg\" (UID: \"c839b8ea-8413-4328-9432-8473e432f1d0\") " pod="openshift-marketplace/redhat-operators-46xwg" Feb 28 09:47:58 crc kubenswrapper[4687]: I0228 09:47:58.495583 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c839b8ea-8413-4328-9432-8473e432f1d0-catalog-content\") pod \"redhat-operators-46xwg\" (UID: \"c839b8ea-8413-4328-9432-8473e432f1d0\") " pod="openshift-marketplace/redhat-operators-46xwg" Feb 28 09:47:58 crc kubenswrapper[4687]: I0228 09:47:58.517245 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc8pd\" (UniqueName: \"kubernetes.io/projected/c839b8ea-8413-4328-9432-8473e432f1d0-kube-api-access-dc8pd\") pod \"redhat-operators-46xwg\" (UID: \"c839b8ea-8413-4328-9432-8473e432f1d0\") " pod="openshift-marketplace/redhat-operators-46xwg" Feb 28 09:47:58 crc kubenswrapper[4687]: I0228 09:47:58.573430 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-46xwg" Feb 28 09:47:59 crc kubenswrapper[4687]: I0228 09:47:59.023056 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-46xwg"] Feb 28 09:47:59 crc kubenswrapper[4687]: I0228 09:47:59.940120 4687 generic.go:334] "Generic (PLEG): container finished" podID="c839b8ea-8413-4328-9432-8473e432f1d0" containerID="019ce6231c8254844b8e23b3f7879674e1c84edf8702266bc97ed5cbeb9194aa" exitCode=0 Feb 28 09:47:59 crc kubenswrapper[4687]: I0228 09:47:59.940169 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46xwg" event={"ID":"c839b8ea-8413-4328-9432-8473e432f1d0","Type":"ContainerDied","Data":"019ce6231c8254844b8e23b3f7879674e1c84edf8702266bc97ed5cbeb9194aa"} Feb 28 09:47:59 crc kubenswrapper[4687]: I0228 09:47:59.940535 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46xwg" event={"ID":"c839b8ea-8413-4328-9432-8473e432f1d0","Type":"ContainerStarted","Data":"90c7aaabb6cc8c64e14dbea902b2da5dc1941053d280251531d62e6cbeecc9dd"} Feb 28 09:48:00 crc kubenswrapper[4687]: I0228 09:48:00.143143 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537868-5wdhp"] Feb 28 09:48:00 crc kubenswrapper[4687]: I0228 09:48:00.144807 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537868-5wdhp" Feb 28 09:48:00 crc kubenswrapper[4687]: I0228 09:48:00.148231 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:48:00 crc kubenswrapper[4687]: I0228 09:48:00.148734 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537868-5wdhp"] Feb 28 09:48:00 crc kubenswrapper[4687]: I0228 09:48:00.149383 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:48:00 crc kubenswrapper[4687]: I0228 09:48:00.149392 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:48:00 crc kubenswrapper[4687]: I0228 09:48:00.234164 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhzhx\" (UniqueName: \"kubernetes.io/projected/421812d0-9afe-48ff-a4e1-6909ebb201d0-kube-api-access-zhzhx\") pod \"auto-csr-approver-29537868-5wdhp\" (UID: \"421812d0-9afe-48ff-a4e1-6909ebb201d0\") " pod="openshift-infra/auto-csr-approver-29537868-5wdhp" Feb 28 09:48:00 crc kubenswrapper[4687]: I0228 09:48:00.335476 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhzhx\" (UniqueName: \"kubernetes.io/projected/421812d0-9afe-48ff-a4e1-6909ebb201d0-kube-api-access-zhzhx\") pod \"auto-csr-approver-29537868-5wdhp\" (UID: \"421812d0-9afe-48ff-a4e1-6909ebb201d0\") " pod="openshift-infra/auto-csr-approver-29537868-5wdhp" Feb 28 09:48:00 crc kubenswrapper[4687]: I0228 09:48:00.359766 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhzhx\" (UniqueName: \"kubernetes.io/projected/421812d0-9afe-48ff-a4e1-6909ebb201d0-kube-api-access-zhzhx\") pod \"auto-csr-approver-29537868-5wdhp\" (UID: \"421812d0-9afe-48ff-a4e1-6909ebb201d0\") " 
pod="openshift-infra/auto-csr-approver-29537868-5wdhp" Feb 28 09:48:00 crc kubenswrapper[4687]: I0228 09:48:00.464131 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537868-5wdhp" Feb 28 09:48:00 crc kubenswrapper[4687]: I0228 09:48:00.871979 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537868-5wdhp"] Feb 28 09:48:00 crc kubenswrapper[4687]: I0228 09:48:00.952108 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46xwg" event={"ID":"c839b8ea-8413-4328-9432-8473e432f1d0","Type":"ContainerStarted","Data":"565dbae05775fe0550c817c01f119eb6777bccc4dc881e9ad66c88f7b4fcc2ea"} Feb 28 09:48:00 crc kubenswrapper[4687]: I0228 09:48:00.953922 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537868-5wdhp" event={"ID":"421812d0-9afe-48ff-a4e1-6909ebb201d0","Type":"ContainerStarted","Data":"a2597fa207e6cb2aaf6c4ed6a1c32c16675459f61f4ca49fe22ce4fc2d3338a8"} Feb 28 09:48:01 crc kubenswrapper[4687]: I0228 09:48:01.965208 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537868-5wdhp" event={"ID":"421812d0-9afe-48ff-a4e1-6909ebb201d0","Type":"ContainerStarted","Data":"f12a1dd6db244429dff6440cf79ed1eb28ad978284b82f66c13b53398d5692f7"} Feb 28 09:48:02 crc kubenswrapper[4687]: I0228 09:48:02.975734 4687 generic.go:334] "Generic (PLEG): container finished" podID="c839b8ea-8413-4328-9432-8473e432f1d0" containerID="565dbae05775fe0550c817c01f119eb6777bccc4dc881e9ad66c88f7b4fcc2ea" exitCode=0 Feb 28 09:48:02 crc kubenswrapper[4687]: I0228 09:48:02.975830 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46xwg" event={"ID":"c839b8ea-8413-4328-9432-8473e432f1d0","Type":"ContainerDied","Data":"565dbae05775fe0550c817c01f119eb6777bccc4dc881e9ad66c88f7b4fcc2ea"} Feb 28 09:48:02 crc 
kubenswrapper[4687]: I0228 09:48:02.978206 4687 generic.go:334] "Generic (PLEG): container finished" podID="421812d0-9afe-48ff-a4e1-6909ebb201d0" containerID="f12a1dd6db244429dff6440cf79ed1eb28ad978284b82f66c13b53398d5692f7" exitCode=0 Feb 28 09:48:02 crc kubenswrapper[4687]: I0228 09:48:02.978283 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537868-5wdhp" event={"ID":"421812d0-9afe-48ff-a4e1-6909ebb201d0","Type":"ContainerDied","Data":"f12a1dd6db244429dff6440cf79ed1eb28ad978284b82f66c13b53398d5692f7"} Feb 28 09:48:03 crc kubenswrapper[4687]: I0228 09:48:03.989669 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46xwg" event={"ID":"c839b8ea-8413-4328-9432-8473e432f1d0","Type":"ContainerStarted","Data":"07f882be94651f9ed7cb13197ff03a00581de559a3f66fc5f4f4335669de7a30"} Feb 28 09:48:04 crc kubenswrapper[4687]: I0228 09:48:04.013870 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-46xwg" podStartSLOduration=2.494270015 podStartE2EDuration="6.01385318s" podCreationTimestamp="2026-02-28 09:47:58 +0000 UTC" firstStartedPulling="2026-02-28 09:47:59.941735927 +0000 UTC m=+2671.632305264" lastFinishedPulling="2026-02-28 09:48:03.461319092 +0000 UTC m=+2675.151888429" observedRunningTime="2026-02-28 09:48:04.006859929 +0000 UTC m=+2675.697429256" watchObservedRunningTime="2026-02-28 09:48:04.01385318 +0000 UTC m=+2675.704422507" Feb 28 09:48:04 crc kubenswrapper[4687]: I0228 09:48:04.334619 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537868-5wdhp" Feb 28 09:48:04 crc kubenswrapper[4687]: I0228 09:48:04.533354 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhzhx\" (UniqueName: \"kubernetes.io/projected/421812d0-9afe-48ff-a4e1-6909ebb201d0-kube-api-access-zhzhx\") pod \"421812d0-9afe-48ff-a4e1-6909ebb201d0\" (UID: \"421812d0-9afe-48ff-a4e1-6909ebb201d0\") " Feb 28 09:48:04 crc kubenswrapper[4687]: I0228 09:48:04.543541 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/421812d0-9afe-48ff-a4e1-6909ebb201d0-kube-api-access-zhzhx" (OuterVolumeSpecName: "kube-api-access-zhzhx") pod "421812d0-9afe-48ff-a4e1-6909ebb201d0" (UID: "421812d0-9afe-48ff-a4e1-6909ebb201d0"). InnerVolumeSpecName "kube-api-access-zhzhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:48:04 crc kubenswrapper[4687]: I0228 09:48:04.636761 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhzhx\" (UniqueName: \"kubernetes.io/projected/421812d0-9afe-48ff-a4e1-6909ebb201d0-kube-api-access-zhzhx\") on node \"crc\" DevicePath \"\"" Feb 28 09:48:04 crc kubenswrapper[4687]: I0228 09:48:04.998517 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537868-5wdhp" event={"ID":"421812d0-9afe-48ff-a4e1-6909ebb201d0","Type":"ContainerDied","Data":"a2597fa207e6cb2aaf6c4ed6a1c32c16675459f61f4ca49fe22ce4fc2d3338a8"} Feb 28 09:48:04 crc kubenswrapper[4687]: I0228 09:48:04.998564 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537868-5wdhp" Feb 28 09:48:04 crc kubenswrapper[4687]: I0228 09:48:04.998572 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2597fa207e6cb2aaf6c4ed6a1c32c16675459f61f4ca49fe22ce4fc2d3338a8" Feb 28 09:48:05 crc kubenswrapper[4687]: I0228 09:48:05.397451 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537862-mmzzs"] Feb 28 09:48:05 crc kubenswrapper[4687]: I0228 09:48:05.404798 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537862-mmzzs"] Feb 28 09:48:06 crc kubenswrapper[4687]: I0228 09:48:06.667721 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56adcdd4-04e5-427f-a293-561e788041fb" path="/var/lib/kubelet/pods/56adcdd4-04e5-427f-a293-561e788041fb/volumes" Feb 28 09:48:08 crc kubenswrapper[4687]: I0228 09:48:08.573844 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-46xwg" Feb 28 09:48:08 crc kubenswrapper[4687]: I0228 09:48:08.574310 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-46xwg" Feb 28 09:48:08 crc kubenswrapper[4687]: I0228 09:48:08.611592 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-46xwg" Feb 28 09:48:09 crc kubenswrapper[4687]: I0228 09:48:09.073802 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-46xwg" Feb 28 09:48:09 crc kubenswrapper[4687]: I0228 09:48:09.112608 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-46xwg"] Feb 28 09:48:11 crc kubenswrapper[4687]: I0228 09:48:11.046692 4687 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-46xwg" podUID="c839b8ea-8413-4328-9432-8473e432f1d0" containerName="registry-server" containerID="cri-o://07f882be94651f9ed7cb13197ff03a00581de559a3f66fc5f4f4335669de7a30" gracePeriod=2 Feb 28 09:48:11 crc kubenswrapper[4687]: I0228 09:48:11.469724 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-46xwg" Feb 28 09:48:11 crc kubenswrapper[4687]: I0228 09:48:11.473695 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc8pd\" (UniqueName: \"kubernetes.io/projected/c839b8ea-8413-4328-9432-8473e432f1d0-kube-api-access-dc8pd\") pod \"c839b8ea-8413-4328-9432-8473e432f1d0\" (UID: \"c839b8ea-8413-4328-9432-8473e432f1d0\") " Feb 28 09:48:11 crc kubenswrapper[4687]: I0228 09:48:11.492626 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c839b8ea-8413-4328-9432-8473e432f1d0-kube-api-access-dc8pd" (OuterVolumeSpecName: "kube-api-access-dc8pd") pod "c839b8ea-8413-4328-9432-8473e432f1d0" (UID: "c839b8ea-8413-4328-9432-8473e432f1d0"). InnerVolumeSpecName "kube-api-access-dc8pd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:48:11 crc kubenswrapper[4687]: I0228 09:48:11.575267 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c839b8ea-8413-4328-9432-8473e432f1d0-catalog-content\") pod \"c839b8ea-8413-4328-9432-8473e432f1d0\" (UID: \"c839b8ea-8413-4328-9432-8473e432f1d0\") " Feb 28 09:48:11 crc kubenswrapper[4687]: I0228 09:48:11.575314 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c839b8ea-8413-4328-9432-8473e432f1d0-utilities\") pod \"c839b8ea-8413-4328-9432-8473e432f1d0\" (UID: \"c839b8ea-8413-4328-9432-8473e432f1d0\") " Feb 28 09:48:11 crc kubenswrapper[4687]: I0228 09:48:11.575670 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc8pd\" (UniqueName: \"kubernetes.io/projected/c839b8ea-8413-4328-9432-8473e432f1d0-kube-api-access-dc8pd\") on node \"crc\" DevicePath \"\"" Feb 28 09:48:11 crc kubenswrapper[4687]: I0228 09:48:11.576081 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c839b8ea-8413-4328-9432-8473e432f1d0-utilities" (OuterVolumeSpecName: "utilities") pod "c839b8ea-8413-4328-9432-8473e432f1d0" (UID: "c839b8ea-8413-4328-9432-8473e432f1d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:48:11 crc kubenswrapper[4687]: I0228 09:48:11.671228 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c839b8ea-8413-4328-9432-8473e432f1d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c839b8ea-8413-4328-9432-8473e432f1d0" (UID: "c839b8ea-8413-4328-9432-8473e432f1d0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:48:11 crc kubenswrapper[4687]: I0228 09:48:11.677547 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c839b8ea-8413-4328-9432-8473e432f1d0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:48:11 crc kubenswrapper[4687]: I0228 09:48:11.677578 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c839b8ea-8413-4328-9432-8473e432f1d0-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:48:12 crc kubenswrapper[4687]: I0228 09:48:12.056452 4687 generic.go:334] "Generic (PLEG): container finished" podID="c839b8ea-8413-4328-9432-8473e432f1d0" containerID="07f882be94651f9ed7cb13197ff03a00581de559a3f66fc5f4f4335669de7a30" exitCode=0 Feb 28 09:48:12 crc kubenswrapper[4687]: I0228 09:48:12.056504 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46xwg" event={"ID":"c839b8ea-8413-4328-9432-8473e432f1d0","Type":"ContainerDied","Data":"07f882be94651f9ed7cb13197ff03a00581de559a3f66fc5f4f4335669de7a30"} Feb 28 09:48:12 crc kubenswrapper[4687]: I0228 09:48:12.057711 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46xwg" event={"ID":"c839b8ea-8413-4328-9432-8473e432f1d0","Type":"ContainerDied","Data":"90c7aaabb6cc8c64e14dbea902b2da5dc1941053d280251531d62e6cbeecc9dd"} Feb 28 09:48:12 crc kubenswrapper[4687]: I0228 09:48:12.057775 4687 scope.go:117] "RemoveContainer" containerID="07f882be94651f9ed7cb13197ff03a00581de559a3f66fc5f4f4335669de7a30" Feb 28 09:48:12 crc kubenswrapper[4687]: I0228 09:48:12.056572 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-46xwg" Feb 28 09:48:12 crc kubenswrapper[4687]: I0228 09:48:12.085365 4687 scope.go:117] "RemoveContainer" containerID="565dbae05775fe0550c817c01f119eb6777bccc4dc881e9ad66c88f7b4fcc2ea" Feb 28 09:48:12 crc kubenswrapper[4687]: I0228 09:48:12.086562 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-46xwg"] Feb 28 09:48:12 crc kubenswrapper[4687]: I0228 09:48:12.094251 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-46xwg"] Feb 28 09:48:12 crc kubenswrapper[4687]: I0228 09:48:12.117587 4687 scope.go:117] "RemoveContainer" containerID="019ce6231c8254844b8e23b3f7879674e1c84edf8702266bc97ed5cbeb9194aa" Feb 28 09:48:12 crc kubenswrapper[4687]: I0228 09:48:12.143388 4687 scope.go:117] "RemoveContainer" containerID="07f882be94651f9ed7cb13197ff03a00581de559a3f66fc5f4f4335669de7a30" Feb 28 09:48:12 crc kubenswrapper[4687]: E0228 09:48:12.143704 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07f882be94651f9ed7cb13197ff03a00581de559a3f66fc5f4f4335669de7a30\": container with ID starting with 07f882be94651f9ed7cb13197ff03a00581de559a3f66fc5f4f4335669de7a30 not found: ID does not exist" containerID="07f882be94651f9ed7cb13197ff03a00581de559a3f66fc5f4f4335669de7a30" Feb 28 09:48:12 crc kubenswrapper[4687]: I0228 09:48:12.143729 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07f882be94651f9ed7cb13197ff03a00581de559a3f66fc5f4f4335669de7a30"} err="failed to get container status \"07f882be94651f9ed7cb13197ff03a00581de559a3f66fc5f4f4335669de7a30\": rpc error: code = NotFound desc = could not find container \"07f882be94651f9ed7cb13197ff03a00581de559a3f66fc5f4f4335669de7a30\": container with ID starting with 07f882be94651f9ed7cb13197ff03a00581de559a3f66fc5f4f4335669de7a30 not found: ID does 
not exist" Feb 28 09:48:12 crc kubenswrapper[4687]: I0228 09:48:12.143750 4687 scope.go:117] "RemoveContainer" containerID="565dbae05775fe0550c817c01f119eb6777bccc4dc881e9ad66c88f7b4fcc2ea" Feb 28 09:48:12 crc kubenswrapper[4687]: E0228 09:48:12.143989 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"565dbae05775fe0550c817c01f119eb6777bccc4dc881e9ad66c88f7b4fcc2ea\": container with ID starting with 565dbae05775fe0550c817c01f119eb6777bccc4dc881e9ad66c88f7b4fcc2ea not found: ID does not exist" containerID="565dbae05775fe0550c817c01f119eb6777bccc4dc881e9ad66c88f7b4fcc2ea" Feb 28 09:48:12 crc kubenswrapper[4687]: I0228 09:48:12.144005 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"565dbae05775fe0550c817c01f119eb6777bccc4dc881e9ad66c88f7b4fcc2ea"} err="failed to get container status \"565dbae05775fe0550c817c01f119eb6777bccc4dc881e9ad66c88f7b4fcc2ea\": rpc error: code = NotFound desc = could not find container \"565dbae05775fe0550c817c01f119eb6777bccc4dc881e9ad66c88f7b4fcc2ea\": container with ID starting with 565dbae05775fe0550c817c01f119eb6777bccc4dc881e9ad66c88f7b4fcc2ea not found: ID does not exist" Feb 28 09:48:12 crc kubenswrapper[4687]: I0228 09:48:12.144036 4687 scope.go:117] "RemoveContainer" containerID="019ce6231c8254844b8e23b3f7879674e1c84edf8702266bc97ed5cbeb9194aa" Feb 28 09:48:12 crc kubenswrapper[4687]: E0228 09:48:12.144259 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"019ce6231c8254844b8e23b3f7879674e1c84edf8702266bc97ed5cbeb9194aa\": container with ID starting with 019ce6231c8254844b8e23b3f7879674e1c84edf8702266bc97ed5cbeb9194aa not found: ID does not exist" containerID="019ce6231c8254844b8e23b3f7879674e1c84edf8702266bc97ed5cbeb9194aa" Feb 28 09:48:12 crc kubenswrapper[4687]: I0228 09:48:12.144291 4687 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"019ce6231c8254844b8e23b3f7879674e1c84edf8702266bc97ed5cbeb9194aa"} err="failed to get container status \"019ce6231c8254844b8e23b3f7879674e1c84edf8702266bc97ed5cbeb9194aa\": rpc error: code = NotFound desc = could not find container \"019ce6231c8254844b8e23b3f7879674e1c84edf8702266bc97ed5cbeb9194aa\": container with ID starting with 019ce6231c8254844b8e23b3f7879674e1c84edf8702266bc97ed5cbeb9194aa not found: ID does not exist" Feb 28 09:48:12 crc kubenswrapper[4687]: I0228 09:48:12.666562 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c839b8ea-8413-4328-9432-8473e432f1d0" path="/var/lib/kubelet/pods/c839b8ea-8413-4328-9432-8473e432f1d0/volumes" Feb 28 09:48:25 crc kubenswrapper[4687]: I0228 09:48:25.002449 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:48:25 crc kubenswrapper[4687]: I0228 09:48:25.003170 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:48:26 crc kubenswrapper[4687]: I0228 09:48:26.030467 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sptsb"] Feb 28 09:48:26 crc kubenswrapper[4687]: E0228 09:48:26.031542 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c839b8ea-8413-4328-9432-8473e432f1d0" containerName="extract-content" Feb 28 09:48:26 crc kubenswrapper[4687]: I0228 09:48:26.031577 4687 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c839b8ea-8413-4328-9432-8473e432f1d0" containerName="extract-content" Feb 28 09:48:26 crc kubenswrapper[4687]: E0228 09:48:26.031593 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="421812d0-9afe-48ff-a4e1-6909ebb201d0" containerName="oc" Feb 28 09:48:26 crc kubenswrapper[4687]: I0228 09:48:26.031601 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="421812d0-9afe-48ff-a4e1-6909ebb201d0" containerName="oc" Feb 28 09:48:26 crc kubenswrapper[4687]: E0228 09:48:26.031622 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c839b8ea-8413-4328-9432-8473e432f1d0" containerName="extract-utilities" Feb 28 09:48:26 crc kubenswrapper[4687]: I0228 09:48:26.031631 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="c839b8ea-8413-4328-9432-8473e432f1d0" containerName="extract-utilities" Feb 28 09:48:26 crc kubenswrapper[4687]: E0228 09:48:26.031652 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c839b8ea-8413-4328-9432-8473e432f1d0" containerName="registry-server" Feb 28 09:48:26 crc kubenswrapper[4687]: I0228 09:48:26.031660 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="c839b8ea-8413-4328-9432-8473e432f1d0" containerName="registry-server" Feb 28 09:48:26 crc kubenswrapper[4687]: I0228 09:48:26.031949 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="c839b8ea-8413-4328-9432-8473e432f1d0" containerName="registry-server" Feb 28 09:48:26 crc kubenswrapper[4687]: I0228 09:48:26.031976 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="421812d0-9afe-48ff-a4e1-6909ebb201d0" containerName="oc" Feb 28 09:48:26 crc kubenswrapper[4687]: I0228 09:48:26.034049 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sptsb" Feb 28 09:48:26 crc kubenswrapper[4687]: I0228 09:48:26.048060 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sptsb"] Feb 28 09:48:26 crc kubenswrapper[4687]: I0228 09:48:26.169748 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvsdq\" (UniqueName: \"kubernetes.io/projected/6acd9179-c4a0-409c-8fab-1591bdde7d2f-kube-api-access-wvsdq\") pod \"community-operators-sptsb\" (UID: \"6acd9179-c4a0-409c-8fab-1591bdde7d2f\") " pod="openshift-marketplace/community-operators-sptsb" Feb 28 09:48:26 crc kubenswrapper[4687]: I0228 09:48:26.169787 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6acd9179-c4a0-409c-8fab-1591bdde7d2f-utilities\") pod \"community-operators-sptsb\" (UID: \"6acd9179-c4a0-409c-8fab-1591bdde7d2f\") " pod="openshift-marketplace/community-operators-sptsb" Feb 28 09:48:26 crc kubenswrapper[4687]: I0228 09:48:26.169817 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6acd9179-c4a0-409c-8fab-1591bdde7d2f-catalog-content\") pod \"community-operators-sptsb\" (UID: \"6acd9179-c4a0-409c-8fab-1591bdde7d2f\") " pod="openshift-marketplace/community-operators-sptsb" Feb 28 09:48:26 crc kubenswrapper[4687]: I0228 09:48:26.273053 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvsdq\" (UniqueName: \"kubernetes.io/projected/6acd9179-c4a0-409c-8fab-1591bdde7d2f-kube-api-access-wvsdq\") pod \"community-operators-sptsb\" (UID: \"6acd9179-c4a0-409c-8fab-1591bdde7d2f\") " pod="openshift-marketplace/community-operators-sptsb" Feb 28 09:48:26 crc kubenswrapper[4687]: I0228 09:48:26.273118 4687 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6acd9179-c4a0-409c-8fab-1591bdde7d2f-utilities\") pod \"community-operators-sptsb\" (UID: \"6acd9179-c4a0-409c-8fab-1591bdde7d2f\") " pod="openshift-marketplace/community-operators-sptsb" Feb 28 09:48:26 crc kubenswrapper[4687]: I0228 09:48:26.273170 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6acd9179-c4a0-409c-8fab-1591bdde7d2f-catalog-content\") pod \"community-operators-sptsb\" (UID: \"6acd9179-c4a0-409c-8fab-1591bdde7d2f\") " pod="openshift-marketplace/community-operators-sptsb" Feb 28 09:48:26 crc kubenswrapper[4687]: I0228 09:48:26.273946 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6acd9179-c4a0-409c-8fab-1591bdde7d2f-utilities\") pod \"community-operators-sptsb\" (UID: \"6acd9179-c4a0-409c-8fab-1591bdde7d2f\") " pod="openshift-marketplace/community-operators-sptsb" Feb 28 09:48:26 crc kubenswrapper[4687]: I0228 09:48:26.274000 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6acd9179-c4a0-409c-8fab-1591bdde7d2f-catalog-content\") pod \"community-operators-sptsb\" (UID: \"6acd9179-c4a0-409c-8fab-1591bdde7d2f\") " pod="openshift-marketplace/community-operators-sptsb" Feb 28 09:48:26 crc kubenswrapper[4687]: I0228 09:48:26.290380 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvsdq\" (UniqueName: \"kubernetes.io/projected/6acd9179-c4a0-409c-8fab-1591bdde7d2f-kube-api-access-wvsdq\") pod \"community-operators-sptsb\" (UID: \"6acd9179-c4a0-409c-8fab-1591bdde7d2f\") " pod="openshift-marketplace/community-operators-sptsb" Feb 28 09:48:26 crc kubenswrapper[4687]: I0228 09:48:26.361544 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sptsb" Feb 28 09:48:26 crc kubenswrapper[4687]: I0228 09:48:26.858290 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sptsb"] Feb 28 09:48:27 crc kubenswrapper[4687]: I0228 09:48:27.185357 4687 generic.go:334] "Generic (PLEG): container finished" podID="6acd9179-c4a0-409c-8fab-1591bdde7d2f" containerID="de046f05c95385825e767d2c0efd76c1935aa8f254dd577d8011de7f9d907960" exitCode=0 Feb 28 09:48:27 crc kubenswrapper[4687]: I0228 09:48:27.185408 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sptsb" event={"ID":"6acd9179-c4a0-409c-8fab-1591bdde7d2f","Type":"ContainerDied","Data":"de046f05c95385825e767d2c0efd76c1935aa8f254dd577d8011de7f9d907960"} Feb 28 09:48:27 crc kubenswrapper[4687]: I0228 09:48:27.185731 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sptsb" event={"ID":"6acd9179-c4a0-409c-8fab-1591bdde7d2f","Type":"ContainerStarted","Data":"e421441508bdb21235abf50025be0adda30242e318d100834db4f21a3100634a"} Feb 28 09:48:28 crc kubenswrapper[4687]: I0228 09:48:28.195838 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sptsb" event={"ID":"6acd9179-c4a0-409c-8fab-1591bdde7d2f","Type":"ContainerStarted","Data":"607f941712a375def75a43e56a32f9c6b0977a5e52c874316073cb54a88c05e2"} Feb 28 09:48:29 crc kubenswrapper[4687]: I0228 09:48:29.206464 4687 generic.go:334] "Generic (PLEG): container finished" podID="6acd9179-c4a0-409c-8fab-1591bdde7d2f" containerID="607f941712a375def75a43e56a32f9c6b0977a5e52c874316073cb54a88c05e2" exitCode=0 Feb 28 09:48:29 crc kubenswrapper[4687]: I0228 09:48:29.206582 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sptsb" 
event={"ID":"6acd9179-c4a0-409c-8fab-1591bdde7d2f","Type":"ContainerDied","Data":"607f941712a375def75a43e56a32f9c6b0977a5e52c874316073cb54a88c05e2"} Feb 28 09:48:30 crc kubenswrapper[4687]: I0228 09:48:30.220845 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sptsb" event={"ID":"6acd9179-c4a0-409c-8fab-1591bdde7d2f","Type":"ContainerStarted","Data":"cd308810221c2978b288940b38e47378feb859b344b08b2ba23e15fbd818b80d"} Feb 28 09:48:30 crc kubenswrapper[4687]: I0228 09:48:30.240765 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sptsb" podStartSLOduration=1.727386304 podStartE2EDuration="4.240735247s" podCreationTimestamp="2026-02-28 09:48:26 +0000 UTC" firstStartedPulling="2026-02-28 09:48:27.187176152 +0000 UTC m=+2698.877745489" lastFinishedPulling="2026-02-28 09:48:29.700525095 +0000 UTC m=+2701.391094432" observedRunningTime="2026-02-28 09:48:30.236685548 +0000 UTC m=+2701.927254895" watchObservedRunningTime="2026-02-28 09:48:30.240735247 +0000 UTC m=+2701.931304584" Feb 28 09:48:36 crc kubenswrapper[4687]: I0228 09:48:36.361979 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sptsb" Feb 28 09:48:36 crc kubenswrapper[4687]: I0228 09:48:36.363399 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sptsb" Feb 28 09:48:36 crc kubenswrapper[4687]: I0228 09:48:36.396877 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sptsb" Feb 28 09:48:37 crc kubenswrapper[4687]: I0228 09:48:37.316302 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sptsb" Feb 28 09:48:37 crc kubenswrapper[4687]: I0228 09:48:37.366573 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-sptsb"] Feb 28 09:48:39 crc kubenswrapper[4687]: I0228 09:48:39.293469 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sptsb" podUID="6acd9179-c4a0-409c-8fab-1591bdde7d2f" containerName="registry-server" containerID="cri-o://cd308810221c2978b288940b38e47378feb859b344b08b2ba23e15fbd818b80d" gracePeriod=2 Feb 28 09:48:39 crc kubenswrapper[4687]: I0228 09:48:39.752790 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sptsb" Feb 28 09:48:39 crc kubenswrapper[4687]: I0228 09:48:39.958130 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvsdq\" (UniqueName: \"kubernetes.io/projected/6acd9179-c4a0-409c-8fab-1591bdde7d2f-kube-api-access-wvsdq\") pod \"6acd9179-c4a0-409c-8fab-1591bdde7d2f\" (UID: \"6acd9179-c4a0-409c-8fab-1591bdde7d2f\") " Feb 28 09:48:39 crc kubenswrapper[4687]: I0228 09:48:39.958215 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6acd9179-c4a0-409c-8fab-1591bdde7d2f-utilities\") pod \"6acd9179-c4a0-409c-8fab-1591bdde7d2f\" (UID: \"6acd9179-c4a0-409c-8fab-1591bdde7d2f\") " Feb 28 09:48:39 crc kubenswrapper[4687]: I0228 09:48:39.958297 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6acd9179-c4a0-409c-8fab-1591bdde7d2f-catalog-content\") pod \"6acd9179-c4a0-409c-8fab-1591bdde7d2f\" (UID: \"6acd9179-c4a0-409c-8fab-1591bdde7d2f\") " Feb 28 09:48:39 crc kubenswrapper[4687]: I0228 09:48:39.959100 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6acd9179-c4a0-409c-8fab-1591bdde7d2f-utilities" (OuterVolumeSpecName: "utilities") pod "6acd9179-c4a0-409c-8fab-1591bdde7d2f" (UID: 
"6acd9179-c4a0-409c-8fab-1591bdde7d2f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:48:39 crc kubenswrapper[4687]: I0228 09:48:39.965548 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6acd9179-c4a0-409c-8fab-1591bdde7d2f-kube-api-access-wvsdq" (OuterVolumeSpecName: "kube-api-access-wvsdq") pod "6acd9179-c4a0-409c-8fab-1591bdde7d2f" (UID: "6acd9179-c4a0-409c-8fab-1591bdde7d2f"). InnerVolumeSpecName "kube-api-access-wvsdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:48:40 crc kubenswrapper[4687]: I0228 09:48:40.005636 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6acd9179-c4a0-409c-8fab-1591bdde7d2f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6acd9179-c4a0-409c-8fab-1591bdde7d2f" (UID: "6acd9179-c4a0-409c-8fab-1591bdde7d2f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:48:40 crc kubenswrapper[4687]: I0228 09:48:40.061305 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvsdq\" (UniqueName: \"kubernetes.io/projected/6acd9179-c4a0-409c-8fab-1591bdde7d2f-kube-api-access-wvsdq\") on node \"crc\" DevicePath \"\"" Feb 28 09:48:40 crc kubenswrapper[4687]: I0228 09:48:40.061349 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6acd9179-c4a0-409c-8fab-1591bdde7d2f-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:48:40 crc kubenswrapper[4687]: I0228 09:48:40.061360 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6acd9179-c4a0-409c-8fab-1591bdde7d2f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:48:40 crc kubenswrapper[4687]: I0228 09:48:40.321757 4687 generic.go:334] "Generic (PLEG): container finished" 
podID="6acd9179-c4a0-409c-8fab-1591bdde7d2f" containerID="cd308810221c2978b288940b38e47378feb859b344b08b2ba23e15fbd818b80d" exitCode=0 Feb 28 09:48:40 crc kubenswrapper[4687]: I0228 09:48:40.321831 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sptsb" event={"ID":"6acd9179-c4a0-409c-8fab-1591bdde7d2f","Type":"ContainerDied","Data":"cd308810221c2978b288940b38e47378feb859b344b08b2ba23e15fbd818b80d"} Feb 28 09:48:40 crc kubenswrapper[4687]: I0228 09:48:40.321868 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sptsb" event={"ID":"6acd9179-c4a0-409c-8fab-1591bdde7d2f","Type":"ContainerDied","Data":"e421441508bdb21235abf50025be0adda30242e318d100834db4f21a3100634a"} Feb 28 09:48:40 crc kubenswrapper[4687]: I0228 09:48:40.321883 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sptsb" Feb 28 09:48:40 crc kubenswrapper[4687]: I0228 09:48:40.321890 4687 scope.go:117] "RemoveContainer" containerID="cd308810221c2978b288940b38e47378feb859b344b08b2ba23e15fbd818b80d" Feb 28 09:48:40 crc kubenswrapper[4687]: I0228 09:48:40.340074 4687 scope.go:117] "RemoveContainer" containerID="607f941712a375def75a43e56a32f9c6b0977a5e52c874316073cb54a88c05e2" Feb 28 09:48:40 crc kubenswrapper[4687]: I0228 09:48:40.358933 4687 scope.go:117] "RemoveContainer" containerID="de046f05c95385825e767d2c0efd76c1935aa8f254dd577d8011de7f9d907960" Feb 28 09:48:40 crc kubenswrapper[4687]: I0228 09:48:40.367067 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sptsb"] Feb 28 09:48:40 crc kubenswrapper[4687]: I0228 09:48:40.372753 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sptsb"] Feb 28 09:48:40 crc kubenswrapper[4687]: I0228 09:48:40.394210 4687 scope.go:117] "RemoveContainer" 
containerID="cd308810221c2978b288940b38e47378feb859b344b08b2ba23e15fbd818b80d" Feb 28 09:48:40 crc kubenswrapper[4687]: E0228 09:48:40.394635 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd308810221c2978b288940b38e47378feb859b344b08b2ba23e15fbd818b80d\": container with ID starting with cd308810221c2978b288940b38e47378feb859b344b08b2ba23e15fbd818b80d not found: ID does not exist" containerID="cd308810221c2978b288940b38e47378feb859b344b08b2ba23e15fbd818b80d" Feb 28 09:48:40 crc kubenswrapper[4687]: I0228 09:48:40.394693 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd308810221c2978b288940b38e47378feb859b344b08b2ba23e15fbd818b80d"} err="failed to get container status \"cd308810221c2978b288940b38e47378feb859b344b08b2ba23e15fbd818b80d\": rpc error: code = NotFound desc = could not find container \"cd308810221c2978b288940b38e47378feb859b344b08b2ba23e15fbd818b80d\": container with ID starting with cd308810221c2978b288940b38e47378feb859b344b08b2ba23e15fbd818b80d not found: ID does not exist" Feb 28 09:48:40 crc kubenswrapper[4687]: I0228 09:48:40.394722 4687 scope.go:117] "RemoveContainer" containerID="607f941712a375def75a43e56a32f9c6b0977a5e52c874316073cb54a88c05e2" Feb 28 09:48:40 crc kubenswrapper[4687]: E0228 09:48:40.394967 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"607f941712a375def75a43e56a32f9c6b0977a5e52c874316073cb54a88c05e2\": container with ID starting with 607f941712a375def75a43e56a32f9c6b0977a5e52c874316073cb54a88c05e2 not found: ID does not exist" containerID="607f941712a375def75a43e56a32f9c6b0977a5e52c874316073cb54a88c05e2" Feb 28 09:48:40 crc kubenswrapper[4687]: I0228 09:48:40.394996 4687 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"607f941712a375def75a43e56a32f9c6b0977a5e52c874316073cb54a88c05e2"} err="failed to get container status \"607f941712a375def75a43e56a32f9c6b0977a5e52c874316073cb54a88c05e2\": rpc error: code = NotFound desc = could not find container \"607f941712a375def75a43e56a32f9c6b0977a5e52c874316073cb54a88c05e2\": container with ID starting with 607f941712a375def75a43e56a32f9c6b0977a5e52c874316073cb54a88c05e2 not found: ID does not exist" Feb 28 09:48:40 crc kubenswrapper[4687]: I0228 09:48:40.395068 4687 scope.go:117] "RemoveContainer" containerID="de046f05c95385825e767d2c0efd76c1935aa8f254dd577d8011de7f9d907960" Feb 28 09:48:40 crc kubenswrapper[4687]: E0228 09:48:40.395417 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de046f05c95385825e767d2c0efd76c1935aa8f254dd577d8011de7f9d907960\": container with ID starting with de046f05c95385825e767d2c0efd76c1935aa8f254dd577d8011de7f9d907960 not found: ID does not exist" containerID="de046f05c95385825e767d2c0efd76c1935aa8f254dd577d8011de7f9d907960" Feb 28 09:48:40 crc kubenswrapper[4687]: I0228 09:48:40.395452 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de046f05c95385825e767d2c0efd76c1935aa8f254dd577d8011de7f9d907960"} err="failed to get container status \"de046f05c95385825e767d2c0efd76c1935aa8f254dd577d8011de7f9d907960\": rpc error: code = NotFound desc = could not find container \"de046f05c95385825e767d2c0efd76c1935aa8f254dd577d8011de7f9d907960\": container with ID starting with de046f05c95385825e767d2c0efd76c1935aa8f254dd577d8011de7f9d907960 not found: ID does not exist" Feb 28 09:48:40 crc kubenswrapper[4687]: I0228 09:48:40.668645 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6acd9179-c4a0-409c-8fab-1591bdde7d2f" path="/var/lib/kubelet/pods/6acd9179-c4a0-409c-8fab-1591bdde7d2f/volumes" Feb 28 09:48:43 crc kubenswrapper[4687]: I0228 
09:48:43.701808 4687 scope.go:117] "RemoveContainer" containerID="0cc4e2d2f047062a2998acae3525c7abe8a05ff34543fc14766427ed720cc3ac" Feb 28 09:48:55 crc kubenswrapper[4687]: I0228 09:48:55.002563 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:48:55 crc kubenswrapper[4687]: I0228 09:48:55.003304 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:48:55 crc kubenswrapper[4687]: I0228 09:48:55.003367 4687 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" Feb 28 09:48:55 crc kubenswrapper[4687]: I0228 09:48:55.003940 4687 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"193f4e131507074613a20b8d12c9de80ed9e99fe06c33cfd5df2585fad845b32"} pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 28 09:48:55 crc kubenswrapper[4687]: I0228 09:48:55.004008 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" containerID="cri-o://193f4e131507074613a20b8d12c9de80ed9e99fe06c33cfd5df2585fad845b32" gracePeriod=600 Feb 28 09:48:55 crc kubenswrapper[4687]: I0228 09:48:55.470465 4687 generic.go:334] 
"Generic (PLEG): container finished" podID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerID="193f4e131507074613a20b8d12c9de80ed9e99fe06c33cfd5df2585fad845b32" exitCode=0 Feb 28 09:48:55 crc kubenswrapper[4687]: I0228 09:48:55.470532 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerDied","Data":"193f4e131507074613a20b8d12c9de80ed9e99fe06c33cfd5df2585fad845b32"} Feb 28 09:48:55 crc kubenswrapper[4687]: I0228 09:48:55.470750 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerStarted","Data":"09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287"} Feb 28 09:48:55 crc kubenswrapper[4687]: I0228 09:48:55.470770 4687 scope.go:117] "RemoveContainer" containerID="483b364d23bb1afce74ade66e1f0d36515560f1be33f953e717c225db6654fc9" Feb 28 09:49:48 crc kubenswrapper[4687]: I0228 09:49:48.656917 4687 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-fdfb795c-sf6nb" podUID="10b30927-e15b-4464-b5e4-1245c90ce5f8" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Feb 28 09:50:00 crc kubenswrapper[4687]: I0228 09:50:00.146397 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537870-bbqwn"] Feb 28 09:50:00 crc kubenswrapper[4687]: E0228 09:50:00.147730 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6acd9179-c4a0-409c-8fab-1591bdde7d2f" containerName="extract-utilities" Feb 28 09:50:00 crc kubenswrapper[4687]: I0228 09:50:00.147746 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="6acd9179-c4a0-409c-8fab-1591bdde7d2f" containerName="extract-utilities" Feb 28 09:50:00 crc kubenswrapper[4687]: E0228 09:50:00.147787 4687 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="6acd9179-c4a0-409c-8fab-1591bdde7d2f" containerName="registry-server" Feb 28 09:50:00 crc kubenswrapper[4687]: I0228 09:50:00.147793 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="6acd9179-c4a0-409c-8fab-1591bdde7d2f" containerName="registry-server" Feb 28 09:50:00 crc kubenswrapper[4687]: E0228 09:50:00.147808 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6acd9179-c4a0-409c-8fab-1591bdde7d2f" containerName="extract-content" Feb 28 09:50:00 crc kubenswrapper[4687]: I0228 09:50:00.147815 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="6acd9179-c4a0-409c-8fab-1591bdde7d2f" containerName="extract-content" Feb 28 09:50:00 crc kubenswrapper[4687]: I0228 09:50:00.148042 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="6acd9179-c4a0-409c-8fab-1591bdde7d2f" containerName="registry-server" Feb 28 09:50:00 crc kubenswrapper[4687]: I0228 09:50:00.149296 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537870-bbqwn" Feb 28 09:50:00 crc kubenswrapper[4687]: I0228 09:50:00.151117 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:50:00 crc kubenswrapper[4687]: I0228 09:50:00.151603 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:50:00 crc kubenswrapper[4687]: I0228 09:50:00.151767 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:50:00 crc kubenswrapper[4687]: I0228 09:50:00.157178 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537870-bbqwn"] Feb 28 09:50:00 crc kubenswrapper[4687]: I0228 09:50:00.283485 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v7d9\" (UniqueName: \"kubernetes.io/projected/a908c220-334f-497e-a077-5ef0b42d1966-kube-api-access-5v7d9\") pod \"auto-csr-approver-29537870-bbqwn\" (UID: \"a908c220-334f-497e-a077-5ef0b42d1966\") " pod="openshift-infra/auto-csr-approver-29537870-bbqwn" Feb 28 09:50:00 crc kubenswrapper[4687]: I0228 09:50:00.386631 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v7d9\" (UniqueName: \"kubernetes.io/projected/a908c220-334f-497e-a077-5ef0b42d1966-kube-api-access-5v7d9\") pod \"auto-csr-approver-29537870-bbqwn\" (UID: \"a908c220-334f-497e-a077-5ef0b42d1966\") " pod="openshift-infra/auto-csr-approver-29537870-bbqwn" Feb 28 09:50:00 crc kubenswrapper[4687]: I0228 09:50:00.405578 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v7d9\" (UniqueName: \"kubernetes.io/projected/a908c220-334f-497e-a077-5ef0b42d1966-kube-api-access-5v7d9\") pod \"auto-csr-approver-29537870-bbqwn\" (UID: \"a908c220-334f-497e-a077-5ef0b42d1966\") " 
pod="openshift-infra/auto-csr-approver-29537870-bbqwn" Feb 28 09:50:00 crc kubenswrapper[4687]: I0228 09:50:00.478996 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537870-bbqwn" Feb 28 09:50:00 crc kubenswrapper[4687]: I0228 09:50:00.898924 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537870-bbqwn"] Feb 28 09:50:00 crc kubenswrapper[4687]: I0228 09:50:00.908100 4687 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 28 09:50:01 crc kubenswrapper[4687]: I0228 09:50:01.088530 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537870-bbqwn" event={"ID":"a908c220-334f-497e-a077-5ef0b42d1966","Type":"ContainerStarted","Data":"54acc2a992c382015d4d8e6c686c0a2c450a878d96cd87be8e856c50a65d71eb"} Feb 28 09:50:02 crc kubenswrapper[4687]: I0228 09:50:02.100819 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537870-bbqwn" event={"ID":"a908c220-334f-497e-a077-5ef0b42d1966","Type":"ContainerStarted","Data":"df4fa89dc0c4bbc7e6438e7b211fa5c065a99ef15ffc17eadc6b168db4fd2e4c"} Feb 28 09:50:02 crc kubenswrapper[4687]: I0228 09:50:02.124532 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29537870-bbqwn" podStartSLOduration=1.175314847 podStartE2EDuration="2.124508988s" podCreationTimestamp="2026-02-28 09:50:00 +0000 UTC" firstStartedPulling="2026-02-28 09:50:00.90780888 +0000 UTC m=+2792.598378218" lastFinishedPulling="2026-02-28 09:50:01.857003022 +0000 UTC m=+2793.547572359" observedRunningTime="2026-02-28 09:50:02.118576756 +0000 UTC m=+2793.809146103" watchObservedRunningTime="2026-02-28 09:50:02.124508988 +0000 UTC m=+2793.815078325" Feb 28 09:50:03 crc kubenswrapper[4687]: I0228 09:50:03.110677 4687 generic.go:334] "Generic (PLEG): container finished" 
podID="a908c220-334f-497e-a077-5ef0b42d1966" containerID="df4fa89dc0c4bbc7e6438e7b211fa5c065a99ef15ffc17eadc6b168db4fd2e4c" exitCode=0 Feb 28 09:50:03 crc kubenswrapper[4687]: I0228 09:50:03.110742 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537870-bbqwn" event={"ID":"a908c220-334f-497e-a077-5ef0b42d1966","Type":"ContainerDied","Data":"df4fa89dc0c4bbc7e6438e7b211fa5c065a99ef15ffc17eadc6b168db4fd2e4c"} Feb 28 09:50:04 crc kubenswrapper[4687]: I0228 09:50:04.419383 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537870-bbqwn" Feb 28 09:50:04 crc kubenswrapper[4687]: I0228 09:50:04.502295 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v7d9\" (UniqueName: \"kubernetes.io/projected/a908c220-334f-497e-a077-5ef0b42d1966-kube-api-access-5v7d9\") pod \"a908c220-334f-497e-a077-5ef0b42d1966\" (UID: \"a908c220-334f-497e-a077-5ef0b42d1966\") " Feb 28 09:50:04 crc kubenswrapper[4687]: I0228 09:50:04.510738 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a908c220-334f-497e-a077-5ef0b42d1966-kube-api-access-5v7d9" (OuterVolumeSpecName: "kube-api-access-5v7d9") pod "a908c220-334f-497e-a077-5ef0b42d1966" (UID: "a908c220-334f-497e-a077-5ef0b42d1966"). InnerVolumeSpecName "kube-api-access-5v7d9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:50:04 crc kubenswrapper[4687]: I0228 09:50:04.605175 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5v7d9\" (UniqueName: \"kubernetes.io/projected/a908c220-334f-497e-a077-5ef0b42d1966-kube-api-access-5v7d9\") on node \"crc\" DevicePath \"\"" Feb 28 09:50:05 crc kubenswrapper[4687]: I0228 09:50:05.130402 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537870-bbqwn" event={"ID":"a908c220-334f-497e-a077-5ef0b42d1966","Type":"ContainerDied","Data":"54acc2a992c382015d4d8e6c686c0a2c450a878d96cd87be8e856c50a65d71eb"} Feb 28 09:50:05 crc kubenswrapper[4687]: I0228 09:50:05.130450 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54acc2a992c382015d4d8e6c686c0a2c450a878d96cd87be8e856c50a65d71eb" Feb 28 09:50:05 crc kubenswrapper[4687]: I0228 09:50:05.130463 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537870-bbqwn" Feb 28 09:50:05 crc kubenswrapper[4687]: I0228 09:50:05.176904 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537864-sslns"] Feb 28 09:50:05 crc kubenswrapper[4687]: I0228 09:50:05.182623 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537864-sslns"] Feb 28 09:50:06 crc kubenswrapper[4687]: I0228 09:50:06.666432 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6299991-2d4b-4e15-93e5-4fc11d251557" path="/var/lib/kubelet/pods/d6299991-2d4b-4e15-93e5-4fc11d251557/volumes" Feb 28 09:50:35 crc kubenswrapper[4687]: I0228 09:50:35.410740 4687 generic.go:334] "Generic (PLEG): container finished" podID="e3d191c1-f8c8-455f-848c-a3d0a7caaf81" containerID="1f5a5aebbc6dc09566d977938b19d5c3fbb5d71509901c67394ed1d4ce42b645" exitCode=0 Feb 28 09:50:35 crc kubenswrapper[4687]: I0228 09:50:35.410785 4687 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e3d191c1-f8c8-455f-848c-a3d0a7caaf81","Type":"ContainerDied","Data":"1f5a5aebbc6dc09566d977938b19d5c3fbb5d71509901c67394ed1d4ce42b645"} Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.707372 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.753243 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-openstack-config\") pod \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.753300 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-test-operator-ephemeral-temporary\") pod \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.753326 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.753359 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-ca-certs\") pod \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.753395 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-openstack-config-secret\") pod \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.753431 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-test-operator-ephemeral-workdir\") pod \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.753457 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-config-data\") pod \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.753499 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4x28\" (UniqueName: \"kubernetes.io/projected/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-kube-api-access-p4x28\") pod \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.753538 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-ssh-key\") pod \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\" (UID: \"e3d191c1-f8c8-455f-848c-a3d0a7caaf81\") " Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.754080 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod 
"e3d191c1-f8c8-455f-848c-a3d0a7caaf81" (UID: "e3d191c1-f8c8-455f-848c-a3d0a7caaf81"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.755063 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-config-data" (OuterVolumeSpecName: "config-data") pod "e3d191c1-f8c8-455f-848c-a3d0a7caaf81" (UID: "e3d191c1-f8c8-455f-848c-a3d0a7caaf81"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.758257 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-kube-api-access-p4x28" (OuterVolumeSpecName: "kube-api-access-p4x28") pod "e3d191c1-f8c8-455f-848c-a3d0a7caaf81" (UID: "e3d191c1-f8c8-455f-848c-a3d0a7caaf81"). InnerVolumeSpecName "kube-api-access-p4x28". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.762206 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "e3d191c1-f8c8-455f-848c-a3d0a7caaf81" (UID: "e3d191c1-f8c8-455f-848c-a3d0a7caaf81"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.767071 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "test-operator-logs") pod "e3d191c1-f8c8-455f-848c-a3d0a7caaf81" (UID: "e3d191c1-f8c8-455f-848c-a3d0a7caaf81"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.777304 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "e3d191c1-f8c8-455f-848c-a3d0a7caaf81" (UID: "e3d191c1-f8c8-455f-848c-a3d0a7caaf81"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.783277 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "e3d191c1-f8c8-455f-848c-a3d0a7caaf81" (UID: "e3d191c1-f8c8-455f-848c-a3d0a7caaf81"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.784581 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e3d191c1-f8c8-455f-848c-a3d0a7caaf81" (UID: "e3d191c1-f8c8-455f-848c-a3d0a7caaf81"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.794858 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "e3d191c1-f8c8-455f-848c-a3d0a7caaf81" (UID: "e3d191c1-f8c8-455f-848c-a3d0a7caaf81"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.855359 4687 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.855583 4687 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.855663 4687 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.855733 4687 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.855793 4687 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.855851 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.855908 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4x28\" (UniqueName: \"kubernetes.io/projected/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-kube-api-access-p4x28\") on node \"crc\" DevicePath \"\"" 
Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.855964 4687 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.856036 4687 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e3d191c1-f8c8-455f-848c-a3d0a7caaf81-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.885252 4687 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 28 09:50:36 crc kubenswrapper[4687]: I0228 09:50:36.958106 4687 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 28 09:50:37 crc kubenswrapper[4687]: I0228 09:50:37.432246 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e3d191c1-f8c8-455f-848c-a3d0a7caaf81","Type":"ContainerDied","Data":"f26d7650fac8251cf40cacca947bbb0db3dc3a01c49d31464219350200af564f"} Feb 28 09:50:37 crc kubenswrapper[4687]: I0228 09:50:37.432307 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f26d7650fac8251cf40cacca947bbb0db3dc3a01c49d31464219350200af564f" Feb 28 09:50:37 crc kubenswrapper[4687]: I0228 09:50:37.432353 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 28 09:50:43 crc kubenswrapper[4687]: I0228 09:50:43.836531 4687 scope.go:117] "RemoveContainer" containerID="bf72747927e3e81008044df684b9cdf57d54632515a57c81695c59b4974c19b6" Feb 28 09:50:45 crc kubenswrapper[4687]: I0228 09:50:45.071080 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 28 09:50:45 crc kubenswrapper[4687]: E0228 09:50:45.072469 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3d191c1-f8c8-455f-848c-a3d0a7caaf81" containerName="tempest-tests-tempest-tests-runner" Feb 28 09:50:45 crc kubenswrapper[4687]: I0228 09:50:45.072548 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3d191c1-f8c8-455f-848c-a3d0a7caaf81" containerName="tempest-tests-tempest-tests-runner" Feb 28 09:50:45 crc kubenswrapper[4687]: E0228 09:50:45.072629 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a908c220-334f-497e-a077-5ef0b42d1966" containerName="oc" Feb 28 09:50:45 crc kubenswrapper[4687]: I0228 09:50:45.072693 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="a908c220-334f-497e-a077-5ef0b42d1966" containerName="oc" Feb 28 09:50:45 crc kubenswrapper[4687]: I0228 09:50:45.072952 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3d191c1-f8c8-455f-848c-a3d0a7caaf81" containerName="tempest-tests-tempest-tests-runner" Feb 28 09:50:45 crc kubenswrapper[4687]: I0228 09:50:45.073061 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="a908c220-334f-497e-a077-5ef0b42d1966" containerName="oc" Feb 28 09:50:45 crc kubenswrapper[4687]: I0228 09:50:45.073822 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 28 09:50:45 crc kubenswrapper[4687]: I0228 09:50:45.075544 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-wqjcg" Feb 28 09:50:45 crc kubenswrapper[4687]: I0228 09:50:45.076469 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 28 09:50:45 crc kubenswrapper[4687]: I0228 09:50:45.230813 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"67a962d7-9b93-4db0-84cc-cd340793023d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 28 09:50:45 crc kubenswrapper[4687]: I0228 09:50:45.231215 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gw2l\" (UniqueName: \"kubernetes.io/projected/67a962d7-9b93-4db0-84cc-cd340793023d-kube-api-access-6gw2l\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"67a962d7-9b93-4db0-84cc-cd340793023d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 28 09:50:45 crc kubenswrapper[4687]: I0228 09:50:45.334330 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gw2l\" (UniqueName: \"kubernetes.io/projected/67a962d7-9b93-4db0-84cc-cd340793023d-kube-api-access-6gw2l\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"67a962d7-9b93-4db0-84cc-cd340793023d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 28 09:50:45 crc kubenswrapper[4687]: I0228 09:50:45.334924 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"67a962d7-9b93-4db0-84cc-cd340793023d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 28 09:50:45 crc kubenswrapper[4687]: I0228 09:50:45.335530 4687 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"67a962d7-9b93-4db0-84cc-cd340793023d\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 28 09:50:45 crc kubenswrapper[4687]: I0228 09:50:45.357050 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gw2l\" (UniqueName: \"kubernetes.io/projected/67a962d7-9b93-4db0-84cc-cd340793023d-kube-api-access-6gw2l\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"67a962d7-9b93-4db0-84cc-cd340793023d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 28 09:50:45 crc kubenswrapper[4687]: I0228 09:50:45.359014 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"67a962d7-9b93-4db0-84cc-cd340793023d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 28 09:50:45 crc kubenswrapper[4687]: I0228 09:50:45.395369 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 28 09:50:45 crc kubenswrapper[4687]: I0228 09:50:45.937405 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 28 09:50:46 crc kubenswrapper[4687]: I0228 09:50:46.518627 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"67a962d7-9b93-4db0-84cc-cd340793023d","Type":"ContainerStarted","Data":"13b95373ac428947513feaadd344ab291075a53b0ec56ce8b72fabff5456bd26"} Feb 28 09:50:47 crc kubenswrapper[4687]: I0228 09:50:47.531639 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"67a962d7-9b93-4db0-84cc-cd340793023d","Type":"ContainerStarted","Data":"2625ff63822f15a9b6d5032a1fc2b85cc171581ce0ce0598efd1cef691ecd92f"} Feb 28 09:50:47 crc kubenswrapper[4687]: I0228 09:50:47.547783 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.764163994 podStartE2EDuration="2.547763017s" podCreationTimestamp="2026-02-28 09:50:45 +0000 UTC" firstStartedPulling="2026-02-28 09:50:45.945988781 +0000 UTC m=+2837.636558118" lastFinishedPulling="2026-02-28 09:50:46.729587804 +0000 UTC m=+2838.420157141" observedRunningTime="2026-02-28 09:50:47.546808462 +0000 UTC m=+2839.237377809" watchObservedRunningTime="2026-02-28 09:50:47.547763017 +0000 UTC m=+2839.238332345" Feb 28 09:50:55 crc kubenswrapper[4687]: I0228 09:50:55.003198 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:50:55 crc 
kubenswrapper[4687]: I0228 09:50:55.003933 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:51:05 crc kubenswrapper[4687]: I0228 09:51:05.407298 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8jjxt/must-gather-rjvgr"] Feb 28 09:51:05 crc kubenswrapper[4687]: I0228 09:51:05.410825 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8jjxt/must-gather-rjvgr" Feb 28 09:51:05 crc kubenswrapper[4687]: I0228 09:51:05.412958 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8jjxt"/"kube-root-ca.crt" Feb 28 09:51:05 crc kubenswrapper[4687]: I0228 09:51:05.419711 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8jjxt"/"openshift-service-ca.crt" Feb 28 09:51:05 crc kubenswrapper[4687]: I0228 09:51:05.421080 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8jjxt/must-gather-rjvgr"] Feb 28 09:51:05 crc kubenswrapper[4687]: I0228 09:51:05.570462 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh6g8\" (UniqueName: \"kubernetes.io/projected/ab705756-374f-437c-bf57-49e79e72cdc1-kube-api-access-nh6g8\") pod \"must-gather-rjvgr\" (UID: \"ab705756-374f-437c-bf57-49e79e72cdc1\") " pod="openshift-must-gather-8jjxt/must-gather-rjvgr" Feb 28 09:51:05 crc kubenswrapper[4687]: I0228 09:51:05.570884 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ab705756-374f-437c-bf57-49e79e72cdc1-must-gather-output\") pod 
\"must-gather-rjvgr\" (UID: \"ab705756-374f-437c-bf57-49e79e72cdc1\") " pod="openshift-must-gather-8jjxt/must-gather-rjvgr" Feb 28 09:51:05 crc kubenswrapper[4687]: I0228 09:51:05.673888 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ab705756-374f-437c-bf57-49e79e72cdc1-must-gather-output\") pod \"must-gather-rjvgr\" (UID: \"ab705756-374f-437c-bf57-49e79e72cdc1\") " pod="openshift-must-gather-8jjxt/must-gather-rjvgr" Feb 28 09:51:05 crc kubenswrapper[4687]: I0228 09:51:05.674040 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh6g8\" (UniqueName: \"kubernetes.io/projected/ab705756-374f-437c-bf57-49e79e72cdc1-kube-api-access-nh6g8\") pod \"must-gather-rjvgr\" (UID: \"ab705756-374f-437c-bf57-49e79e72cdc1\") " pod="openshift-must-gather-8jjxt/must-gather-rjvgr" Feb 28 09:51:05 crc kubenswrapper[4687]: I0228 09:51:05.674409 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ab705756-374f-437c-bf57-49e79e72cdc1-must-gather-output\") pod \"must-gather-rjvgr\" (UID: \"ab705756-374f-437c-bf57-49e79e72cdc1\") " pod="openshift-must-gather-8jjxt/must-gather-rjvgr" Feb 28 09:51:05 crc kubenswrapper[4687]: I0228 09:51:05.692077 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh6g8\" (UniqueName: \"kubernetes.io/projected/ab705756-374f-437c-bf57-49e79e72cdc1-kube-api-access-nh6g8\") pod \"must-gather-rjvgr\" (UID: \"ab705756-374f-437c-bf57-49e79e72cdc1\") " pod="openshift-must-gather-8jjxt/must-gather-rjvgr" Feb 28 09:51:05 crc kubenswrapper[4687]: I0228 09:51:05.729250 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8jjxt/must-gather-rjvgr" Feb 28 09:51:06 crc kubenswrapper[4687]: I0228 09:51:06.133834 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8jjxt/must-gather-rjvgr"] Feb 28 09:51:06 crc kubenswrapper[4687]: I0228 09:51:06.708083 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8jjxt/must-gather-rjvgr" event={"ID":"ab705756-374f-437c-bf57-49e79e72cdc1","Type":"ContainerStarted","Data":"ca1dacf792b81304a50b7f5aac5b75c75917bf3ae7dc2dd3dc326baf6736e6a0"} Feb 28 09:51:12 crc kubenswrapper[4687]: I0228 09:51:12.777115 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8jjxt/must-gather-rjvgr" event={"ID":"ab705756-374f-437c-bf57-49e79e72cdc1","Type":"ContainerStarted","Data":"6c934c25915a93f85df61097b65e82d2720baf23725857eb0b7040117f7b8235"} Feb 28 09:51:12 crc kubenswrapper[4687]: I0228 09:51:12.777933 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8jjxt/must-gather-rjvgr" event={"ID":"ab705756-374f-437c-bf57-49e79e72cdc1","Type":"ContainerStarted","Data":"5518b3187d188f445ed4e4737a57abff296ee48c264f73738613fc39c0da7220"} Feb 28 09:51:12 crc kubenswrapper[4687]: I0228 09:51:12.793506 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8jjxt/must-gather-rjvgr" podStartSLOduration=2.091790361 podStartE2EDuration="7.793487703s" podCreationTimestamp="2026-02-28 09:51:05 +0000 UTC" firstStartedPulling="2026-02-28 09:51:06.144672133 +0000 UTC m=+2857.835241470" lastFinishedPulling="2026-02-28 09:51:11.846369475 +0000 UTC m=+2863.536938812" observedRunningTime="2026-02-28 09:51:12.79040104 +0000 UTC m=+2864.480970378" watchObservedRunningTime="2026-02-28 09:51:12.793487703 +0000 UTC m=+2864.484057040" Feb 28 09:51:15 crc kubenswrapper[4687]: I0228 09:51:15.112099 4687 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-8jjxt/crc-debug-sk9hn"] Feb 28 09:51:15 crc kubenswrapper[4687]: I0228 09:51:15.113781 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8jjxt/crc-debug-sk9hn" Feb 28 09:51:15 crc kubenswrapper[4687]: I0228 09:51:15.115827 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8jjxt"/"default-dockercfg-nvtjh" Feb 28 09:51:15 crc kubenswrapper[4687]: I0228 09:51:15.308436 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt9hj\" (UniqueName: \"kubernetes.io/projected/e9fa1db2-ab47-4647-8e29-2a56556f8bc6-kube-api-access-xt9hj\") pod \"crc-debug-sk9hn\" (UID: \"e9fa1db2-ab47-4647-8e29-2a56556f8bc6\") " pod="openshift-must-gather-8jjxt/crc-debug-sk9hn" Feb 28 09:51:15 crc kubenswrapper[4687]: I0228 09:51:15.308739 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9fa1db2-ab47-4647-8e29-2a56556f8bc6-host\") pod \"crc-debug-sk9hn\" (UID: \"e9fa1db2-ab47-4647-8e29-2a56556f8bc6\") " pod="openshift-must-gather-8jjxt/crc-debug-sk9hn" Feb 28 09:51:15 crc kubenswrapper[4687]: I0228 09:51:15.411586 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9fa1db2-ab47-4647-8e29-2a56556f8bc6-host\") pod \"crc-debug-sk9hn\" (UID: \"e9fa1db2-ab47-4647-8e29-2a56556f8bc6\") " pod="openshift-must-gather-8jjxt/crc-debug-sk9hn" Feb 28 09:51:15 crc kubenswrapper[4687]: I0228 09:51:15.411761 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9fa1db2-ab47-4647-8e29-2a56556f8bc6-host\") pod \"crc-debug-sk9hn\" (UID: \"e9fa1db2-ab47-4647-8e29-2a56556f8bc6\") " pod="openshift-must-gather-8jjxt/crc-debug-sk9hn" Feb 28 09:51:15 crc kubenswrapper[4687]: I0228 09:51:15.411768 
4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt9hj\" (UniqueName: \"kubernetes.io/projected/e9fa1db2-ab47-4647-8e29-2a56556f8bc6-kube-api-access-xt9hj\") pod \"crc-debug-sk9hn\" (UID: \"e9fa1db2-ab47-4647-8e29-2a56556f8bc6\") " pod="openshift-must-gather-8jjxt/crc-debug-sk9hn" Feb 28 09:51:15 crc kubenswrapper[4687]: I0228 09:51:15.430888 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt9hj\" (UniqueName: \"kubernetes.io/projected/e9fa1db2-ab47-4647-8e29-2a56556f8bc6-kube-api-access-xt9hj\") pod \"crc-debug-sk9hn\" (UID: \"e9fa1db2-ab47-4647-8e29-2a56556f8bc6\") " pod="openshift-must-gather-8jjxt/crc-debug-sk9hn" Feb 28 09:51:15 crc kubenswrapper[4687]: I0228 09:51:15.731111 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8jjxt/crc-debug-sk9hn" Feb 28 09:51:15 crc kubenswrapper[4687]: W0228 09:51:15.760570 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9fa1db2_ab47_4647_8e29_2a56556f8bc6.slice/crio-ec40192bb04769b4eba59daa394ed04d80900f5d1c668859311dff818885ca37 WatchSource:0}: Error finding container ec40192bb04769b4eba59daa394ed04d80900f5d1c668859311dff818885ca37: Status 404 returned error can't find the container with id ec40192bb04769b4eba59daa394ed04d80900f5d1c668859311dff818885ca37 Feb 28 09:51:15 crc kubenswrapper[4687]: I0228 09:51:15.806353 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8jjxt/crc-debug-sk9hn" event={"ID":"e9fa1db2-ab47-4647-8e29-2a56556f8bc6","Type":"ContainerStarted","Data":"ec40192bb04769b4eba59daa394ed04d80900f5d1c668859311dff818885ca37"} Feb 28 09:51:25 crc kubenswrapper[4687]: I0228 09:51:25.002726 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:51:25 crc kubenswrapper[4687]: I0228 09:51:25.003267 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:51:26 crc kubenswrapper[4687]: I0228 09:51:26.894347 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8jjxt/crc-debug-sk9hn" event={"ID":"e9fa1db2-ab47-4647-8e29-2a56556f8bc6","Type":"ContainerStarted","Data":"22f7bafdae776385b74ea7bb22abddd4d0fe18376516cd5a81a85e04e2bfc2b3"} Feb 28 09:51:26 crc kubenswrapper[4687]: I0228 09:51:26.909446 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8jjxt/crc-debug-sk9hn" podStartSLOduration=1.6652928500000002 podStartE2EDuration="11.909426384s" podCreationTimestamp="2026-02-28 09:51:15 +0000 UTC" firstStartedPulling="2026-02-28 09:51:15.764809743 +0000 UTC m=+2867.455379079" lastFinishedPulling="2026-02-28 09:51:26.008943276 +0000 UTC m=+2877.699512613" observedRunningTime="2026-02-28 09:51:26.903326577 +0000 UTC m=+2878.593895914" watchObservedRunningTime="2026-02-28 09:51:26.909426384 +0000 UTC m=+2878.599995721" Feb 28 09:51:55 crc kubenswrapper[4687]: I0228 09:51:55.002526 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:51:55 crc kubenswrapper[4687]: I0228 09:51:55.003236 4687 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:51:55 crc kubenswrapper[4687]: I0228 09:51:55.003305 4687 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" Feb 28 09:51:55 crc kubenswrapper[4687]: I0228 09:51:55.003951 4687 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287"} pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 28 09:51:55 crc kubenswrapper[4687]: I0228 09:51:55.004005 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" containerID="cri-o://09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" gracePeriod=600 Feb 28 09:51:55 crc kubenswrapper[4687]: E0228 09:51:55.128674 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:51:55 crc kubenswrapper[4687]: I0228 09:51:55.147994 4687 generic.go:334] "Generic (PLEG): container finished" podID="e9fa1db2-ab47-4647-8e29-2a56556f8bc6" 
containerID="22f7bafdae776385b74ea7bb22abddd4d0fe18376516cd5a81a85e04e2bfc2b3" exitCode=0 Feb 28 09:51:55 crc kubenswrapper[4687]: I0228 09:51:55.148074 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8jjxt/crc-debug-sk9hn" event={"ID":"e9fa1db2-ab47-4647-8e29-2a56556f8bc6","Type":"ContainerDied","Data":"22f7bafdae776385b74ea7bb22abddd4d0fe18376516cd5a81a85e04e2bfc2b3"} Feb 28 09:51:55 crc kubenswrapper[4687]: I0228 09:51:55.157809 4687 generic.go:334] "Generic (PLEG): container finished" podID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" exitCode=0 Feb 28 09:51:55 crc kubenswrapper[4687]: I0228 09:51:55.157846 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerDied","Data":"09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287"} Feb 28 09:51:55 crc kubenswrapper[4687]: I0228 09:51:55.157873 4687 scope.go:117] "RemoveContainer" containerID="193f4e131507074613a20b8d12c9de80ed9e99fe06c33cfd5df2585fad845b32" Feb 28 09:51:55 crc kubenswrapper[4687]: I0228 09:51:55.158192 4687 scope.go:117] "RemoveContainer" containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" Feb 28 09:51:55 crc kubenswrapper[4687]: E0228 09:51:55.158442 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:51:56 crc kubenswrapper[4687]: I0228 09:51:56.242085 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8jjxt/crc-debug-sk9hn" Feb 28 09:51:56 crc kubenswrapper[4687]: I0228 09:51:56.277928 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8jjxt/crc-debug-sk9hn"] Feb 28 09:51:56 crc kubenswrapper[4687]: I0228 09:51:56.286334 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8jjxt/crc-debug-sk9hn"] Feb 28 09:51:56 crc kubenswrapper[4687]: I0228 09:51:56.382058 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt9hj\" (UniqueName: \"kubernetes.io/projected/e9fa1db2-ab47-4647-8e29-2a56556f8bc6-kube-api-access-xt9hj\") pod \"e9fa1db2-ab47-4647-8e29-2a56556f8bc6\" (UID: \"e9fa1db2-ab47-4647-8e29-2a56556f8bc6\") " Feb 28 09:51:56 crc kubenswrapper[4687]: I0228 09:51:56.382190 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9fa1db2-ab47-4647-8e29-2a56556f8bc6-host\") pod \"e9fa1db2-ab47-4647-8e29-2a56556f8bc6\" (UID: \"e9fa1db2-ab47-4647-8e29-2a56556f8bc6\") " Feb 28 09:51:56 crc kubenswrapper[4687]: I0228 09:51:56.382973 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9fa1db2-ab47-4647-8e29-2a56556f8bc6-host" (OuterVolumeSpecName: "host") pod "e9fa1db2-ab47-4647-8e29-2a56556f8bc6" (UID: "e9fa1db2-ab47-4647-8e29-2a56556f8bc6"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:51:56 crc kubenswrapper[4687]: I0228 09:51:56.383270 4687 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9fa1db2-ab47-4647-8e29-2a56556f8bc6-host\") on node \"crc\" DevicePath \"\"" Feb 28 09:51:56 crc kubenswrapper[4687]: I0228 09:51:56.389822 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9fa1db2-ab47-4647-8e29-2a56556f8bc6-kube-api-access-xt9hj" (OuterVolumeSpecName: "kube-api-access-xt9hj") pod "e9fa1db2-ab47-4647-8e29-2a56556f8bc6" (UID: "e9fa1db2-ab47-4647-8e29-2a56556f8bc6"). InnerVolumeSpecName "kube-api-access-xt9hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:51:56 crc kubenswrapper[4687]: I0228 09:51:56.485367 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt9hj\" (UniqueName: \"kubernetes.io/projected/e9fa1db2-ab47-4647-8e29-2a56556f8bc6-kube-api-access-xt9hj\") on node \"crc\" DevicePath \"\"" Feb 28 09:51:56 crc kubenswrapper[4687]: I0228 09:51:56.666240 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9fa1db2-ab47-4647-8e29-2a56556f8bc6" path="/var/lib/kubelet/pods/e9fa1db2-ab47-4647-8e29-2a56556f8bc6/volumes" Feb 28 09:51:57 crc kubenswrapper[4687]: I0228 09:51:57.179036 4687 scope.go:117] "RemoveContainer" containerID="22f7bafdae776385b74ea7bb22abddd4d0fe18376516cd5a81a85e04e2bfc2b3" Feb 28 09:51:57 crc kubenswrapper[4687]: I0228 09:51:57.179088 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8jjxt/crc-debug-sk9hn" Feb 28 09:51:57 crc kubenswrapper[4687]: I0228 09:51:57.410620 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8jjxt/crc-debug-dsl4b"] Feb 28 09:51:57 crc kubenswrapper[4687]: E0228 09:51:57.411068 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9fa1db2-ab47-4647-8e29-2a56556f8bc6" containerName="container-00" Feb 28 09:51:57 crc kubenswrapper[4687]: I0228 09:51:57.411083 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9fa1db2-ab47-4647-8e29-2a56556f8bc6" containerName="container-00" Feb 28 09:51:57 crc kubenswrapper[4687]: I0228 09:51:57.411313 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9fa1db2-ab47-4647-8e29-2a56556f8bc6" containerName="container-00" Feb 28 09:51:57 crc kubenswrapper[4687]: I0228 09:51:57.411983 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8jjxt/crc-debug-dsl4b" Feb 28 09:51:57 crc kubenswrapper[4687]: I0228 09:51:57.413759 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8jjxt"/"default-dockercfg-nvtjh" Feb 28 09:51:57 crc kubenswrapper[4687]: I0228 09:51:57.507734 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d41dd3d-85df-40c0-93d3-4677be451c26-host\") pod \"crc-debug-dsl4b\" (UID: \"1d41dd3d-85df-40c0-93d3-4677be451c26\") " pod="openshift-must-gather-8jjxt/crc-debug-dsl4b" Feb 28 09:51:57 crc kubenswrapper[4687]: I0228 09:51:57.507838 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp8fl\" (UniqueName: \"kubernetes.io/projected/1d41dd3d-85df-40c0-93d3-4677be451c26-kube-api-access-kp8fl\") pod \"crc-debug-dsl4b\" (UID: \"1d41dd3d-85df-40c0-93d3-4677be451c26\") " 
pod="openshift-must-gather-8jjxt/crc-debug-dsl4b" Feb 28 09:51:57 crc kubenswrapper[4687]: I0228 09:51:57.610116 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d41dd3d-85df-40c0-93d3-4677be451c26-host\") pod \"crc-debug-dsl4b\" (UID: \"1d41dd3d-85df-40c0-93d3-4677be451c26\") " pod="openshift-must-gather-8jjxt/crc-debug-dsl4b" Feb 28 09:51:57 crc kubenswrapper[4687]: I0228 09:51:57.610205 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp8fl\" (UniqueName: \"kubernetes.io/projected/1d41dd3d-85df-40c0-93d3-4677be451c26-kube-api-access-kp8fl\") pod \"crc-debug-dsl4b\" (UID: \"1d41dd3d-85df-40c0-93d3-4677be451c26\") " pod="openshift-must-gather-8jjxt/crc-debug-dsl4b" Feb 28 09:51:57 crc kubenswrapper[4687]: I0228 09:51:57.610290 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d41dd3d-85df-40c0-93d3-4677be451c26-host\") pod \"crc-debug-dsl4b\" (UID: \"1d41dd3d-85df-40c0-93d3-4677be451c26\") " pod="openshift-must-gather-8jjxt/crc-debug-dsl4b" Feb 28 09:51:57 crc kubenswrapper[4687]: I0228 09:51:57.631271 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp8fl\" (UniqueName: \"kubernetes.io/projected/1d41dd3d-85df-40c0-93d3-4677be451c26-kube-api-access-kp8fl\") pod \"crc-debug-dsl4b\" (UID: \"1d41dd3d-85df-40c0-93d3-4677be451c26\") " pod="openshift-must-gather-8jjxt/crc-debug-dsl4b" Feb 28 09:51:57 crc kubenswrapper[4687]: I0228 09:51:57.726465 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8jjxt/crc-debug-dsl4b" Feb 28 09:51:58 crc kubenswrapper[4687]: I0228 09:51:58.189947 4687 generic.go:334] "Generic (PLEG): container finished" podID="1d41dd3d-85df-40c0-93d3-4677be451c26" containerID="773af5d50c26d554ff73b27498a5c1854bf313669ac9bf6a3e5e87bf8fe4dbec" exitCode=0 Feb 28 09:51:58 crc kubenswrapper[4687]: I0228 09:51:58.189999 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8jjxt/crc-debug-dsl4b" event={"ID":"1d41dd3d-85df-40c0-93d3-4677be451c26","Type":"ContainerDied","Data":"773af5d50c26d554ff73b27498a5c1854bf313669ac9bf6a3e5e87bf8fe4dbec"} Feb 28 09:51:58 crc kubenswrapper[4687]: I0228 09:51:58.190052 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8jjxt/crc-debug-dsl4b" event={"ID":"1d41dd3d-85df-40c0-93d3-4677be451c26","Type":"ContainerStarted","Data":"33a21be70349e4187872c839f468f921e4007947d914412ef59f43a52fde6a3b"} Feb 28 09:51:58 crc kubenswrapper[4687]: I0228 09:51:58.639066 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8jjxt/crc-debug-dsl4b"] Feb 28 09:51:58 crc kubenswrapper[4687]: I0228 09:51:58.646743 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8jjxt/crc-debug-dsl4b"] Feb 28 09:51:59 crc kubenswrapper[4687]: I0228 09:51:59.281132 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8jjxt/crc-debug-dsl4b" Feb 28 09:51:59 crc kubenswrapper[4687]: I0228 09:51:59.345977 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d41dd3d-85df-40c0-93d3-4677be451c26-host\") pod \"1d41dd3d-85df-40c0-93d3-4677be451c26\" (UID: \"1d41dd3d-85df-40c0-93d3-4677be451c26\") " Feb 28 09:51:59 crc kubenswrapper[4687]: I0228 09:51:59.346091 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d41dd3d-85df-40c0-93d3-4677be451c26-host" (OuterVolumeSpecName: "host") pod "1d41dd3d-85df-40c0-93d3-4677be451c26" (UID: "1d41dd3d-85df-40c0-93d3-4677be451c26"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:51:59 crc kubenswrapper[4687]: I0228 09:51:59.346278 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp8fl\" (UniqueName: \"kubernetes.io/projected/1d41dd3d-85df-40c0-93d3-4677be451c26-kube-api-access-kp8fl\") pod \"1d41dd3d-85df-40c0-93d3-4677be451c26\" (UID: \"1d41dd3d-85df-40c0-93d3-4677be451c26\") " Feb 28 09:51:59 crc kubenswrapper[4687]: I0228 09:51:59.347102 4687 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d41dd3d-85df-40c0-93d3-4677be451c26-host\") on node \"crc\" DevicePath \"\"" Feb 28 09:51:59 crc kubenswrapper[4687]: I0228 09:51:59.352570 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d41dd3d-85df-40c0-93d3-4677be451c26-kube-api-access-kp8fl" (OuterVolumeSpecName: "kube-api-access-kp8fl") pod "1d41dd3d-85df-40c0-93d3-4677be451c26" (UID: "1d41dd3d-85df-40c0-93d3-4677be451c26"). InnerVolumeSpecName "kube-api-access-kp8fl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:51:59 crc kubenswrapper[4687]: I0228 09:51:59.448963 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp8fl\" (UniqueName: \"kubernetes.io/projected/1d41dd3d-85df-40c0-93d3-4677be451c26-kube-api-access-kp8fl\") on node \"crc\" DevicePath \"\"" Feb 28 09:51:59 crc kubenswrapper[4687]: I0228 09:51:59.815633 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8jjxt/crc-debug-7t4xk"] Feb 28 09:51:59 crc kubenswrapper[4687]: E0228 09:51:59.816090 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d41dd3d-85df-40c0-93d3-4677be451c26" containerName="container-00" Feb 28 09:51:59 crc kubenswrapper[4687]: I0228 09:51:59.816107 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d41dd3d-85df-40c0-93d3-4677be451c26" containerName="container-00" Feb 28 09:51:59 crc kubenswrapper[4687]: I0228 09:51:59.816314 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d41dd3d-85df-40c0-93d3-4677be451c26" containerName="container-00" Feb 28 09:51:59 crc kubenswrapper[4687]: I0228 09:51:59.817011 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8jjxt/crc-debug-7t4xk" Feb 28 09:51:59 crc kubenswrapper[4687]: I0228 09:51:59.856006 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcf9j\" (UniqueName: \"kubernetes.io/projected/0da9027a-c15d-475c-b4ea-196e7b9889f5-kube-api-access-wcf9j\") pod \"crc-debug-7t4xk\" (UID: \"0da9027a-c15d-475c-b4ea-196e7b9889f5\") " pod="openshift-must-gather-8jjxt/crc-debug-7t4xk" Feb 28 09:51:59 crc kubenswrapper[4687]: I0228 09:51:59.856245 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0da9027a-c15d-475c-b4ea-196e7b9889f5-host\") pod \"crc-debug-7t4xk\" (UID: \"0da9027a-c15d-475c-b4ea-196e7b9889f5\") " pod="openshift-must-gather-8jjxt/crc-debug-7t4xk" Feb 28 09:51:59 crc kubenswrapper[4687]: I0228 09:51:59.959200 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcf9j\" (UniqueName: \"kubernetes.io/projected/0da9027a-c15d-475c-b4ea-196e7b9889f5-kube-api-access-wcf9j\") pod \"crc-debug-7t4xk\" (UID: \"0da9027a-c15d-475c-b4ea-196e7b9889f5\") " pod="openshift-must-gather-8jjxt/crc-debug-7t4xk" Feb 28 09:51:59 crc kubenswrapper[4687]: I0228 09:51:59.959277 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0da9027a-c15d-475c-b4ea-196e7b9889f5-host\") pod \"crc-debug-7t4xk\" (UID: \"0da9027a-c15d-475c-b4ea-196e7b9889f5\") " pod="openshift-must-gather-8jjxt/crc-debug-7t4xk" Feb 28 09:51:59 crc kubenswrapper[4687]: I0228 09:51:59.959463 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0da9027a-c15d-475c-b4ea-196e7b9889f5-host\") pod \"crc-debug-7t4xk\" (UID: \"0da9027a-c15d-475c-b4ea-196e7b9889f5\") " pod="openshift-must-gather-8jjxt/crc-debug-7t4xk" Feb 28 09:51:59 crc 
kubenswrapper[4687]: I0228 09:51:59.976408 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcf9j\" (UniqueName: \"kubernetes.io/projected/0da9027a-c15d-475c-b4ea-196e7b9889f5-kube-api-access-wcf9j\") pod \"crc-debug-7t4xk\" (UID: \"0da9027a-c15d-475c-b4ea-196e7b9889f5\") " pod="openshift-must-gather-8jjxt/crc-debug-7t4xk" Feb 28 09:52:00 crc kubenswrapper[4687]: I0228 09:52:00.133761 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8jjxt/crc-debug-7t4xk" Feb 28 09:52:00 crc kubenswrapper[4687]: I0228 09:52:00.145155 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537872-4vzzf"] Feb 28 09:52:00 crc kubenswrapper[4687]: I0228 09:52:00.146490 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537872-4vzzf" Feb 28 09:52:00 crc kubenswrapper[4687]: I0228 09:52:00.149665 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:52:00 crc kubenswrapper[4687]: I0228 09:52:00.154699 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:52:00 crc kubenswrapper[4687]: I0228 09:52:00.156503 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537872-4vzzf"] Feb 28 09:52:00 crc kubenswrapper[4687]: I0228 09:52:00.164238 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6qb6\" (UniqueName: \"kubernetes.io/projected/12154ab8-bd23-418d-a6d3-a1b4c8d51fad-kube-api-access-n6qb6\") pod \"auto-csr-approver-29537872-4vzzf\" (UID: \"12154ab8-bd23-418d-a6d3-a1b4c8d51fad\") " pod="openshift-infra/auto-csr-approver-29537872-4vzzf" Feb 28 09:52:00 crc kubenswrapper[4687]: I0228 09:52:00.171803 4687 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:52:00 crc kubenswrapper[4687]: I0228 09:52:00.216057 4687 scope.go:117] "RemoveContainer" containerID="773af5d50c26d554ff73b27498a5c1854bf313669ac9bf6a3e5e87bf8fe4dbec" Feb 28 09:52:00 crc kubenswrapper[4687]: I0228 09:52:00.216182 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8jjxt/crc-debug-dsl4b" Feb 28 09:52:00 crc kubenswrapper[4687]: I0228 09:52:00.223942 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8jjxt/crc-debug-7t4xk" event={"ID":"0da9027a-c15d-475c-b4ea-196e7b9889f5","Type":"ContainerStarted","Data":"47837318f9bee98b0174ca9b2f99276db1fc4e429df001cf1647e66b85c9f5d2"} Feb 28 09:52:00 crc kubenswrapper[4687]: I0228 09:52:00.266651 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6qb6\" (UniqueName: \"kubernetes.io/projected/12154ab8-bd23-418d-a6d3-a1b4c8d51fad-kube-api-access-n6qb6\") pod \"auto-csr-approver-29537872-4vzzf\" (UID: \"12154ab8-bd23-418d-a6d3-a1b4c8d51fad\") " pod="openshift-infra/auto-csr-approver-29537872-4vzzf" Feb 28 09:52:00 crc kubenswrapper[4687]: I0228 09:52:00.284719 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6qb6\" (UniqueName: \"kubernetes.io/projected/12154ab8-bd23-418d-a6d3-a1b4c8d51fad-kube-api-access-n6qb6\") pod \"auto-csr-approver-29537872-4vzzf\" (UID: \"12154ab8-bd23-418d-a6d3-a1b4c8d51fad\") " pod="openshift-infra/auto-csr-approver-29537872-4vzzf" Feb 28 09:52:00 crc kubenswrapper[4687]: I0228 09:52:00.498716 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537872-4vzzf" Feb 28 09:52:00 crc kubenswrapper[4687]: I0228 09:52:00.665711 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d41dd3d-85df-40c0-93d3-4677be451c26" path="/var/lib/kubelet/pods/1d41dd3d-85df-40c0-93d3-4677be451c26/volumes" Feb 28 09:52:00 crc kubenswrapper[4687]: I0228 09:52:00.924853 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537872-4vzzf"] Feb 28 09:52:00 crc kubenswrapper[4687]: W0228 09:52:00.934265 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12154ab8_bd23_418d_a6d3_a1b4c8d51fad.slice/crio-cfbf4283cf73e87905b54ec03fdbca96c24da2b03d77dd27c70690f1d891c754 WatchSource:0}: Error finding container cfbf4283cf73e87905b54ec03fdbca96c24da2b03d77dd27c70690f1d891c754: Status 404 returned error can't find the container with id cfbf4283cf73e87905b54ec03fdbca96c24da2b03d77dd27c70690f1d891c754 Feb 28 09:52:01 crc kubenswrapper[4687]: I0228 09:52:01.237301 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537872-4vzzf" event={"ID":"12154ab8-bd23-418d-a6d3-a1b4c8d51fad","Type":"ContainerStarted","Data":"cfbf4283cf73e87905b54ec03fdbca96c24da2b03d77dd27c70690f1d891c754"} Feb 28 09:52:01 crc kubenswrapper[4687]: I0228 09:52:01.239953 4687 generic.go:334] "Generic (PLEG): container finished" podID="0da9027a-c15d-475c-b4ea-196e7b9889f5" containerID="5f9b692dd0898e9814b32ac9a0d823ad7008935112e5a49d94c7e74b0f6be74f" exitCode=0 Feb 28 09:52:01 crc kubenswrapper[4687]: I0228 09:52:01.240005 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8jjxt/crc-debug-7t4xk" event={"ID":"0da9027a-c15d-475c-b4ea-196e7b9889f5","Type":"ContainerDied","Data":"5f9b692dd0898e9814b32ac9a0d823ad7008935112e5a49d94c7e74b0f6be74f"} Feb 28 09:52:01 crc kubenswrapper[4687]: I0228 
09:52:01.273551 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8jjxt/crc-debug-7t4xk"] Feb 28 09:52:01 crc kubenswrapper[4687]: I0228 09:52:01.279393 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8jjxt/crc-debug-7t4xk"] Feb 28 09:52:02 crc kubenswrapper[4687]: I0228 09:52:02.266982 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537872-4vzzf" event={"ID":"12154ab8-bd23-418d-a6d3-a1b4c8d51fad","Type":"ContainerStarted","Data":"d62de7ddf93325e029768f88eb8dcd9e33fb888a0fe194af3b5739de519a67fb"} Feb 28 09:52:02 crc kubenswrapper[4687]: I0228 09:52:02.298706 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29537872-4vzzf" podStartSLOduration=1.305056357 podStartE2EDuration="2.298687308s" podCreationTimestamp="2026-02-28 09:52:00 +0000 UTC" firstStartedPulling="2026-02-28 09:52:00.9366373 +0000 UTC m=+2912.627206637" lastFinishedPulling="2026-02-28 09:52:01.930268261 +0000 UTC m=+2913.620837588" observedRunningTime="2026-02-28 09:52:02.290764834 +0000 UTC m=+2913.981334172" watchObservedRunningTime="2026-02-28 09:52:02.298687308 +0000 UTC m=+2913.989256645" Feb 28 09:52:02 crc kubenswrapper[4687]: I0228 09:52:02.347894 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8jjxt/crc-debug-7t4xk" Feb 28 09:52:02 crc kubenswrapper[4687]: I0228 09:52:02.412849 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcf9j\" (UniqueName: \"kubernetes.io/projected/0da9027a-c15d-475c-b4ea-196e7b9889f5-kube-api-access-wcf9j\") pod \"0da9027a-c15d-475c-b4ea-196e7b9889f5\" (UID: \"0da9027a-c15d-475c-b4ea-196e7b9889f5\") " Feb 28 09:52:02 crc kubenswrapper[4687]: I0228 09:52:02.412914 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0da9027a-c15d-475c-b4ea-196e7b9889f5-host\") pod \"0da9027a-c15d-475c-b4ea-196e7b9889f5\" (UID: \"0da9027a-c15d-475c-b4ea-196e7b9889f5\") " Feb 28 09:52:02 crc kubenswrapper[4687]: I0228 09:52:02.413830 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0da9027a-c15d-475c-b4ea-196e7b9889f5-host" (OuterVolumeSpecName: "host") pod "0da9027a-c15d-475c-b4ea-196e7b9889f5" (UID: "0da9027a-c15d-475c-b4ea-196e7b9889f5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:52:02 crc kubenswrapper[4687]: I0228 09:52:02.419364 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0da9027a-c15d-475c-b4ea-196e7b9889f5-kube-api-access-wcf9j" (OuterVolumeSpecName: "kube-api-access-wcf9j") pod "0da9027a-c15d-475c-b4ea-196e7b9889f5" (UID: "0da9027a-c15d-475c-b4ea-196e7b9889f5"). InnerVolumeSpecName "kube-api-access-wcf9j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:52:02 crc kubenswrapper[4687]: I0228 09:52:02.515614 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcf9j\" (UniqueName: \"kubernetes.io/projected/0da9027a-c15d-475c-b4ea-196e7b9889f5-kube-api-access-wcf9j\") on node \"crc\" DevicePath \"\"" Feb 28 09:52:02 crc kubenswrapper[4687]: I0228 09:52:02.515784 4687 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0da9027a-c15d-475c-b4ea-196e7b9889f5-host\") on node \"crc\" DevicePath \"\"" Feb 28 09:52:02 crc kubenswrapper[4687]: I0228 09:52:02.666654 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0da9027a-c15d-475c-b4ea-196e7b9889f5" path="/var/lib/kubelet/pods/0da9027a-c15d-475c-b4ea-196e7b9889f5/volumes" Feb 28 09:52:03 crc kubenswrapper[4687]: I0228 09:52:03.277778 4687 scope.go:117] "RemoveContainer" containerID="5f9b692dd0898e9814b32ac9a0d823ad7008935112e5a49d94c7e74b0f6be74f" Feb 28 09:52:03 crc kubenswrapper[4687]: I0228 09:52:03.277790 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8jjxt/crc-debug-7t4xk" Feb 28 09:52:03 crc kubenswrapper[4687]: I0228 09:52:03.281342 4687 generic.go:334] "Generic (PLEG): container finished" podID="12154ab8-bd23-418d-a6d3-a1b4c8d51fad" containerID="d62de7ddf93325e029768f88eb8dcd9e33fb888a0fe194af3b5739de519a67fb" exitCode=0 Feb 28 09:52:03 crc kubenswrapper[4687]: I0228 09:52:03.281397 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537872-4vzzf" event={"ID":"12154ab8-bd23-418d-a6d3-a1b4c8d51fad","Type":"ContainerDied","Data":"d62de7ddf93325e029768f88eb8dcd9e33fb888a0fe194af3b5739de519a67fb"} Feb 28 09:52:04 crc kubenswrapper[4687]: I0228 09:52:04.583236 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537872-4vzzf" Feb 28 09:52:04 crc kubenswrapper[4687]: I0228 09:52:04.663847 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6qb6\" (UniqueName: \"kubernetes.io/projected/12154ab8-bd23-418d-a6d3-a1b4c8d51fad-kube-api-access-n6qb6\") pod \"12154ab8-bd23-418d-a6d3-a1b4c8d51fad\" (UID: \"12154ab8-bd23-418d-a6d3-a1b4c8d51fad\") " Feb 28 09:52:04 crc kubenswrapper[4687]: I0228 09:52:04.671256 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12154ab8-bd23-418d-a6d3-a1b4c8d51fad-kube-api-access-n6qb6" (OuterVolumeSpecName: "kube-api-access-n6qb6") pod "12154ab8-bd23-418d-a6d3-a1b4c8d51fad" (UID: "12154ab8-bd23-418d-a6d3-a1b4c8d51fad"). InnerVolumeSpecName "kube-api-access-n6qb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:52:04 crc kubenswrapper[4687]: I0228 09:52:04.767771 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6qb6\" (UniqueName: \"kubernetes.io/projected/12154ab8-bd23-418d-a6d3-a1b4c8d51fad-kube-api-access-n6qb6\") on node \"crc\" DevicePath \"\"" Feb 28 09:52:05 crc kubenswrapper[4687]: I0228 09:52:05.303485 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537872-4vzzf" event={"ID":"12154ab8-bd23-418d-a6d3-a1b4c8d51fad","Type":"ContainerDied","Data":"cfbf4283cf73e87905b54ec03fdbca96c24da2b03d77dd27c70690f1d891c754"} Feb 28 09:52:05 crc kubenswrapper[4687]: I0228 09:52:05.303550 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfbf4283cf73e87905b54ec03fdbca96c24da2b03d77dd27c70690f1d891c754" Feb 28 09:52:05 crc kubenswrapper[4687]: I0228 09:52:05.303561 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537872-4vzzf" Feb 28 09:52:05 crc kubenswrapper[4687]: I0228 09:52:05.375776 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537866-fpbzs"] Feb 28 09:52:05 crc kubenswrapper[4687]: I0228 09:52:05.381851 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537866-fpbzs"] Feb 28 09:52:06 crc kubenswrapper[4687]: I0228 09:52:06.669365 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecbba413-398b-4e6d-9f27-d7c3ce6bed48" path="/var/lib/kubelet/pods/ecbba413-398b-4e6d-9f27-d7c3ce6bed48/volumes" Feb 28 09:52:08 crc kubenswrapper[4687]: I0228 09:52:08.665293 4687 scope.go:117] "RemoveContainer" containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" Feb 28 09:52:08 crc kubenswrapper[4687]: E0228 09:52:08.665722 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:52:16 crc kubenswrapper[4687]: I0228 09:52:16.333359 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6f95b8bb44-tjzcn_fa58d12c-eed3-46e2-915f-c8383b8949fe/barbican-api/0.log" Feb 28 09:52:16 crc kubenswrapper[4687]: I0228 09:52:16.517931 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6586f4f898-ssm26_cc722f81-31b0-44eb-8206-4256e2ae12f0/barbican-keystone-listener/0.log" Feb 28 09:52:16 crc kubenswrapper[4687]: I0228 09:52:16.528694 4687 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-6f95b8bb44-tjzcn_fa58d12c-eed3-46e2-915f-c8383b8949fe/barbican-api-log/0.log" Feb 28 09:52:16 crc kubenswrapper[4687]: I0228 09:52:16.577368 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6586f4f898-ssm26_cc722f81-31b0-44eb-8206-4256e2ae12f0/barbican-keystone-listener-log/0.log" Feb 28 09:52:16 crc kubenswrapper[4687]: I0228 09:52:16.706687 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5f58cc8c7c-dxx99_5ec85d56-f00e-4193-b4eb-ae0d43a13ffa/barbican-worker/0.log" Feb 28 09:52:16 crc kubenswrapper[4687]: I0228 09:52:16.712425 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5f58cc8c7c-dxx99_5ec85d56-f00e-4193-b4eb-ae0d43a13ffa/barbican-worker-log/0.log" Feb 28 09:52:16 crc kubenswrapper[4687]: I0228 09:52:16.902681 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls_e607377f-9f4c-4f40-8d5c-17487eb054b8/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:52:16 crc kubenswrapper[4687]: I0228 09:52:16.947581 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d031a035-5ae3-4544-9181-756dba921ef0/ceilometer-central-agent/0.log" Feb 28 09:52:17 crc kubenswrapper[4687]: I0228 09:52:17.019925 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d031a035-5ae3-4544-9181-756dba921ef0/ceilometer-notification-agent/0.log" Feb 28 09:52:17 crc kubenswrapper[4687]: I0228 09:52:17.105942 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d031a035-5ae3-4544-9181-756dba921ef0/proxy-httpd/0.log" Feb 28 09:52:17 crc kubenswrapper[4687]: I0228 09:52:17.121995 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d031a035-5ae3-4544-9181-756dba921ef0/sg-core/0.log" Feb 28 09:52:17 crc 
kubenswrapper[4687]: I0228 09:52:17.204069 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c7902e63-a118-4905-ad9d-3a4d15edce78/cinder-api/0.log" Feb 28 09:52:17 crc kubenswrapper[4687]: I0228 09:52:17.271967 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c7902e63-a118-4905-ad9d-3a4d15edce78/cinder-api-log/0.log" Feb 28 09:52:17 crc kubenswrapper[4687]: I0228 09:52:17.361146 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_0e9f0b9e-618d-409d-b76f-5da56783af17/cinder-scheduler/0.log" Feb 28 09:52:17 crc kubenswrapper[4687]: I0228 09:52:17.452777 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_0e9f0b9e-618d-409d-b76f-5da56783af17/probe/0.log" Feb 28 09:52:17 crc kubenswrapper[4687]: I0228 09:52:17.507937 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4_f3bbc9b7-2863-45fb-a890-fba1253b1f63/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:52:17 crc kubenswrapper[4687]: I0228 09:52:17.667186 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5_8a947fbc-4fb5-4be7-819c-703c45480b29/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:52:17 crc kubenswrapper[4687]: I0228 09:52:17.774694 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79fcc958f9-9dbr2_a8f2c1ae-1407-4d58-86af-05f1f1311d1a/init/0.log" Feb 28 09:52:17 crc kubenswrapper[4687]: I0228 09:52:17.941222 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79fcc958f9-9dbr2_a8f2c1ae-1407-4d58-86af-05f1f1311d1a/init/0.log" Feb 28 09:52:17 crc kubenswrapper[4687]: I0228 09:52:17.986934 4687 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-79fcc958f9-9dbr2_a8f2c1ae-1407-4d58-86af-05f1f1311d1a/dnsmasq-dns/0.log" Feb 28 09:52:18 crc kubenswrapper[4687]: I0228 09:52:18.018917 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt_2162138d-1397-4721-adeb-73e30bf37580/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:52:18 crc kubenswrapper[4687]: I0228 09:52:18.163121 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_df7927ff-9e46-45c4-8f30-f55742dda755/glance-httpd/0.log" Feb 28 09:52:18 crc kubenswrapper[4687]: I0228 09:52:18.199275 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_df7927ff-9e46-45c4-8f30-f55742dda755/glance-log/0.log" Feb 28 09:52:18 crc kubenswrapper[4687]: I0228 09:52:18.324680 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8ab8b6d1-f4d6-4206-94a9-14e1770f672a/glance-log/0.log" Feb 28 09:52:18 crc kubenswrapper[4687]: I0228 09:52:18.342851 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8ab8b6d1-f4d6-4206-94a9-14e1770f672a/glance-httpd/0.log" Feb 28 09:52:18 crc kubenswrapper[4687]: I0228 09:52:18.503711 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-b9587f844-jq5pd_113841cd-f813-4ee0-93cf-2e3cfb43f6fc/horizon/0.log" Feb 28 09:52:18 crc kubenswrapper[4687]: I0228 09:52:18.671378 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k_d966dc9f-36d1-4236-8839-0f9794c0e663/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:52:18 crc kubenswrapper[4687]: I0228 09:52:18.778066 4687 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_horizon-b9587f844-jq5pd_113841cd-f813-4ee0-93cf-2e3cfb43f6fc/horizon-log/0.log" Feb 28 09:52:18 crc kubenswrapper[4687]: I0228 09:52:18.834542 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-gdmsq_380b1201-b6ba-48e4-b282-fad4f9b945d7/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:52:19 crc kubenswrapper[4687]: I0228 09:52:19.020432 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-8685d6f5dd-ndtlf_8fcd0fba-03d4-4584-b991-7f719e04b98d/keystone-api/0.log" Feb 28 09:52:19 crc kubenswrapper[4687]: I0228 09:52:19.052220 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_42ce0499-adfd-41cd-9f90-db487bc7c7a0/kube-state-metrics/0.log" Feb 28 09:52:19 crc kubenswrapper[4687]: I0228 09:52:19.195714 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9_6fb2570c-4ba8-41f6-83a3-038b8ab54177/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:52:19 crc kubenswrapper[4687]: I0228 09:52:19.513734 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6bd86ccc79-8jlb2_2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a/neutron-api/0.log" Feb 28 09:52:19 crc kubenswrapper[4687]: I0228 09:52:19.592702 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6bd86ccc79-8jlb2_2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a/neutron-httpd/0.log" Feb 28 09:52:19 crc kubenswrapper[4687]: I0228 09:52:19.753746 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8_29b1d03b-8788-4d8d-8105-700b9cfe905a/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:52:20 crc kubenswrapper[4687]: I0228 09:52:20.208319 4687 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_7060db5b-32fc-481f-a4d6-520e585175b7/nova-cell0-conductor-conductor/0.log" Feb 28 09:52:20 crc kubenswrapper[4687]: I0228 09:52:20.263951 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a36f861b-f068-4184-bca3-ef07c5d8cec5/nova-api-log/0.log" Feb 28 09:52:20 crc kubenswrapper[4687]: I0228 09:52:20.351943 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a36f861b-f068-4184-bca3-ef07c5d8cec5/nova-api-api/0.log" Feb 28 09:52:20 crc kubenswrapper[4687]: I0228 09:52:20.472522 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_e45dcf0c-b04a-4ae5-9488-2051b3ea91df/nova-cell1-conductor-conductor/0.log" Feb 28 09:52:20 crc kubenswrapper[4687]: I0228 09:52:20.520904 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_02b56b91-2ca9-4bea-b8d4-ad653daa91b8/nova-cell1-novncproxy-novncproxy/0.log" Feb 28 09:52:20 crc kubenswrapper[4687]: I0228 09:52:20.694415 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-dv48x_b0b65af5-abae-4587-abda-dfda34ed0d0b/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:52:20 crc kubenswrapper[4687]: I0228 09:52:20.804862 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8d4ccf04-08de-4138-ba4a-b8f5659a37fc/nova-metadata-log/0.log" Feb 28 09:52:21 crc kubenswrapper[4687]: I0228 09:52:21.103662 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_5586e3ed-9ec4-4c0f-9d31-57120488f2cd/nova-scheduler-scheduler/0.log" Feb 28 09:52:21 crc kubenswrapper[4687]: I0228 09:52:21.168980 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d1fe0178-db8f-44e3-9e53-a2450914080a/mysql-bootstrap/0.log" Feb 28 09:52:21 crc kubenswrapper[4687]: I0228 
09:52:21.285648 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d1fe0178-db8f-44e3-9e53-a2450914080a/mysql-bootstrap/0.log" Feb 28 09:52:21 crc kubenswrapper[4687]: I0228 09:52:21.360650 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d1fe0178-db8f-44e3-9e53-a2450914080a/galera/0.log" Feb 28 09:52:21 crc kubenswrapper[4687]: I0228 09:52:21.484371 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c1fac181-ae33-45e1-8171-1d998d59bc04/mysql-bootstrap/0.log" Feb 28 09:52:21 crc kubenswrapper[4687]: I0228 09:52:21.541084 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8d4ccf04-08de-4138-ba4a-b8f5659a37fc/nova-metadata-metadata/0.log" Feb 28 09:52:21 crc kubenswrapper[4687]: I0228 09:52:21.604187 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c1fac181-ae33-45e1-8171-1d998d59bc04/mysql-bootstrap/0.log" Feb 28 09:52:21 crc kubenswrapper[4687]: I0228 09:52:21.653204 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c1fac181-ae33-45e1-8171-1d998d59bc04/galera/0.log" Feb 28 09:52:21 crc kubenswrapper[4687]: I0228 09:52:21.657295 4687 scope.go:117] "RemoveContainer" containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" Feb 28 09:52:21 crc kubenswrapper[4687]: E0228 09:52:21.657615 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:52:21 crc kubenswrapper[4687]: I0228 09:52:21.777875 4687 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_openstackclient_4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39/openstackclient/0.log" Feb 28 09:52:21 crc kubenswrapper[4687]: I0228 09:52:21.845531 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-grkmn_b7837572-8dcc-409d-b8fd-c37f2af52474/ovn-controller/0.log" Feb 28 09:52:21 crc kubenswrapper[4687]: I0228 09:52:21.990726 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-csrrp_d4f7bb81-e353-405c-9676-8a57d0886dae/openstack-network-exporter/0.log" Feb 28 09:52:22 crc kubenswrapper[4687]: I0228 09:52:22.092258 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kbhr4_ce17423e-ccd3-4aad-9538-2424a822d5df/ovsdb-server-init/0.log" Feb 28 09:52:22 crc kubenswrapper[4687]: I0228 09:52:22.272823 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kbhr4_ce17423e-ccd3-4aad-9538-2424a822d5df/ovs-vswitchd/0.log" Feb 28 09:52:22 crc kubenswrapper[4687]: I0228 09:52:22.295504 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kbhr4_ce17423e-ccd3-4aad-9538-2424a822d5df/ovsdb-server-init/0.log" Feb 28 09:52:22 crc kubenswrapper[4687]: I0228 09:52:22.311769 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kbhr4_ce17423e-ccd3-4aad-9538-2424a822d5df/ovsdb-server/0.log" Feb 28 09:52:22 crc kubenswrapper[4687]: I0228 09:52:22.490957 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_99f1dc2d-f77e-447b-836c-d485426a72c2/openstack-network-exporter/0.log" Feb 28 09:52:22 crc kubenswrapper[4687]: I0228 09:52:22.523647 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-hcsz4_c1151261-c776-4190-ad84-46a4a3c68a6a/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:52:22 crc 
kubenswrapper[4687]: I0228 09:52:22.593236 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_99f1dc2d-f77e-447b-836c-d485426a72c2/ovn-northd/0.log" Feb 28 09:52:22 crc kubenswrapper[4687]: I0228 09:52:22.740066 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695/openstack-network-exporter/0.log" Feb 28 09:52:22 crc kubenswrapper[4687]: I0228 09:52:22.837977 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695/ovsdbserver-nb/0.log" Feb 28 09:52:22 crc kubenswrapper[4687]: I0228 09:52:22.966453 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_dcb66eab-811b-4162-a74b-2fc36e9e51b5/openstack-network-exporter/0.log" Feb 28 09:52:23 crc kubenswrapper[4687]: I0228 09:52:23.001843 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_dcb66eab-811b-4162-a74b-2fc36e9e51b5/ovsdbserver-sb/0.log" Feb 28 09:52:23 crc kubenswrapper[4687]: I0228 09:52:23.200645 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-d6696bd5b-vf747_0aa8b593-6c7b-438e-b95c-3f39081df0ea/placement-api/0.log" Feb 28 09:52:23 crc kubenswrapper[4687]: I0228 09:52:23.285207 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-d6696bd5b-vf747_0aa8b593-6c7b-438e-b95c-3f39081df0ea/placement-log/0.log" Feb 28 09:52:23 crc kubenswrapper[4687]: I0228 09:52:23.286826 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_02945b48-0d0e-4c7c-8247-7b3060a6fc3c/setup-container/0.log" Feb 28 09:52:23 crc kubenswrapper[4687]: I0228 09:52:23.530919 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_02945b48-0d0e-4c7c-8247-7b3060a6fc3c/setup-container/0.log" Feb 28 09:52:23 crc kubenswrapper[4687]: I0228 
09:52:23.561316 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0af13829-a7ca-4952-8e73-2923cc70ef98/setup-container/0.log" Feb 28 09:52:23 crc kubenswrapper[4687]: I0228 09:52:23.602671 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_02945b48-0d0e-4c7c-8247-7b3060a6fc3c/rabbitmq/0.log" Feb 28 09:52:23 crc kubenswrapper[4687]: I0228 09:52:23.795137 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0af13829-a7ca-4952-8e73-2923cc70ef98/setup-container/0.log" Feb 28 09:52:23 crc kubenswrapper[4687]: I0228 09:52:23.817982 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd_2bb3057f-10bb-43e9-af01-41131c5b6fb1/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:52:23 crc kubenswrapper[4687]: I0228 09:52:23.835140 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0af13829-a7ca-4952-8e73-2923cc70ef98/rabbitmq/0.log" Feb 28 09:52:24 crc kubenswrapper[4687]: I0228 09:52:24.249667 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd_bb39766e-6294-4141-be47-7a7085460449/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:52:24 crc kubenswrapper[4687]: I0228 09:52:24.277673 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-lg8ps_5a7981ec-8e60-4379-af52-5188e5b53dcf/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:52:24 crc kubenswrapper[4687]: I0228 09:52:24.460124 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-bn6fw_47d00581-22fa-4c52-a057-6d757f969f52/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:52:24 crc kubenswrapper[4687]: I0228 09:52:24.538762 4687 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-989zc_a605b600-b94d-4f23-9922-f9d8478cf6ef/ssh-known-hosts-edpm-deployment/0.log" Feb 28 09:52:24 crc kubenswrapper[4687]: I0228 09:52:24.733717 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-fdfb795c-sf6nb_10b30927-e15b-4464-b5e4-1245c90ce5f8/proxy-server/0.log" Feb 28 09:52:24 crc kubenswrapper[4687]: I0228 09:52:24.800040 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-fdfb795c-sf6nb_10b30927-e15b-4464-b5e4-1245c90ce5f8/proxy-httpd/0.log" Feb 28 09:52:24 crc kubenswrapper[4687]: I0228 09:52:24.947443 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-s57nv_6cf929c8-d005-4feb-8eb4-544e89507ad9/swift-ring-rebalance/0.log" Feb 28 09:52:25 crc kubenswrapper[4687]: I0228 09:52:25.024946 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/account-reaper/0.log" Feb 28 09:52:25 crc kubenswrapper[4687]: I0228 09:52:25.030968 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/account-auditor/0.log" Feb 28 09:52:25 crc kubenswrapper[4687]: I0228 09:52:25.135566 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/account-replicator/0.log" Feb 28 09:52:25 crc kubenswrapper[4687]: I0228 09:52:25.163131 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/account-server/0.log" Feb 28 09:52:25 crc kubenswrapper[4687]: I0228 09:52:25.196198 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/container-auditor/0.log" Feb 28 09:52:25 crc kubenswrapper[4687]: I0228 09:52:25.305877 4687 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/container-replicator/0.log" Feb 28 09:52:25 crc kubenswrapper[4687]: I0228 09:52:25.322978 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/container-server/0.log" Feb 28 09:52:25 crc kubenswrapper[4687]: I0228 09:52:25.362947 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/container-updater/0.log" Feb 28 09:52:25 crc kubenswrapper[4687]: I0228 09:52:25.444917 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/object-auditor/0.log" Feb 28 09:52:25 crc kubenswrapper[4687]: I0228 09:52:25.478036 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/object-expirer/0.log" Feb 28 09:52:25 crc kubenswrapper[4687]: I0228 09:52:25.534740 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/object-replicator/0.log" Feb 28 09:52:25 crc kubenswrapper[4687]: I0228 09:52:25.601304 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/object-server/0.log" Feb 28 09:52:25 crc kubenswrapper[4687]: I0228 09:52:25.670387 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/object-updater/0.log" Feb 28 09:52:25 crc kubenswrapper[4687]: I0228 09:52:25.680644 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/rsync/0.log" Feb 28 09:52:25 crc kubenswrapper[4687]: I0228 09:52:25.798659 4687 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/swift-recon-cron/0.log" Feb 28 09:52:25 crc kubenswrapper[4687]: I0228 09:52:25.901496 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc_6f4d944c-dd63-414e-8886-5b38a982c01a/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:52:25 crc kubenswrapper[4687]: I0228 09:52:25.992933 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_e3d191c1-f8c8-455f-848c-a3d0a7caaf81/tempest-tests-tempest-tests-runner/0.log" Feb 28 09:52:26 crc kubenswrapper[4687]: I0228 09:52:26.044820 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_67a962d7-9b93-4db0-84cc-cd340793023d/test-operator-logs-container/0.log" Feb 28 09:52:26 crc kubenswrapper[4687]: I0228 09:52:26.225399 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-zgthc_b83907ec-ac55-4f72-9265-e919fa57514a/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:52:33 crc kubenswrapper[4687]: I0228 09:52:33.658495 4687 scope.go:117] "RemoveContainer" containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" Feb 28 09:52:33 crc kubenswrapper[4687]: E0228 09:52:33.659124 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:52:34 crc kubenswrapper[4687]: I0228 09:52:34.825406 4687 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_48796fdd-f9c8-473a-b17f-c6da6d0ba3a5/memcached/0.log" Feb 28 09:52:43 crc kubenswrapper[4687]: I0228 09:52:43.930616 4687 scope.go:117] "RemoveContainer" containerID="ab8d80b72b56294558989f4e080d7f54740dd00a34a2db26c58f59ea249314ea" Feb 28 09:52:44 crc kubenswrapper[4687]: I0228 09:52:44.657928 4687 scope.go:117] "RemoveContainer" containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" Feb 28 09:52:44 crc kubenswrapper[4687]: E0228 09:52:44.658527 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:52:48 crc kubenswrapper[4687]: I0228 09:52:48.673678 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf_cc095223-5798-4cc2-a762-ca92a629167c/util/0.log" Feb 28 09:52:48 crc kubenswrapper[4687]: I0228 09:52:48.873604 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf_cc095223-5798-4cc2-a762-ca92a629167c/util/0.log" Feb 28 09:52:48 crc kubenswrapper[4687]: I0228 09:52:48.892374 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf_cc095223-5798-4cc2-a762-ca92a629167c/pull/0.log" Feb 28 09:52:48 crc kubenswrapper[4687]: I0228 09:52:48.897899 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf_cc095223-5798-4cc2-a762-ca92a629167c/pull/0.log" Feb 28 
09:52:49 crc kubenswrapper[4687]: I0228 09:52:49.035160 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf_cc095223-5798-4cc2-a762-ca92a629167c/util/0.log" Feb 28 09:52:49 crc kubenswrapper[4687]: I0228 09:52:49.064918 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf_cc095223-5798-4cc2-a762-ca92a629167c/extract/0.log" Feb 28 09:52:49 crc kubenswrapper[4687]: I0228 09:52:49.065296 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf_cc095223-5798-4cc2-a762-ca92a629167c/pull/0.log" Feb 28 09:52:49 crc kubenswrapper[4687]: I0228 09:52:49.430053 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-5d87c9d997-7wrs7_c3d5a3fe-4e59-43c3-aef3-33c3e7830cb1/manager/0.log" Feb 28 09:52:49 crc kubenswrapper[4687]: I0228 09:52:49.787815 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-64db6967f8-9zkzk_30b87ec4-ee50-402d-8afc-a3f9241bbc4c/manager/0.log" Feb 28 09:52:49 crc kubenswrapper[4687]: I0228 09:52:49.855781 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-cf99c678f-ltpvl_5945c472-0f03-4666-84ca-b8f4545db411/manager/0.log" Feb 28 09:52:50 crc kubenswrapper[4687]: I0228 09:52:50.125678 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-78bc7f9bd9-v9vbd_0e2af601-594d-47f7-95ef-0474051dae27/manager/0.log" Feb 28 09:52:50 crc kubenswrapper[4687]: I0228 09:52:50.708085 4687 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-545456dc4-9nm28_f5b51009-d199-4b88-9158-1b7b3b1848d3/manager/0.log" Feb 28 09:52:50 crc kubenswrapper[4687]: I0228 09:52:50.708615 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-chfpl_40ae4140-3768-425a-9791-234afb6297fe/manager/0.log" Feb 28 09:52:50 crc kubenswrapper[4687]: I0228 09:52:50.828738 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-f7fcc58b9-vqdm7_caa33de5-0fe2-4930-bf89-0f8ad6a96ca2/manager/0.log" Feb 28 09:52:50 crc kubenswrapper[4687]: I0228 09:52:50.968683 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7c789f89c6-8r8kv_14725449-2193-4b84-b736-31c04f9f43e4/manager/0.log" Feb 28 09:52:51 crc kubenswrapper[4687]: I0228 09:52:51.116505 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-jw6hs_89b24774-f0eb-4d63-a124-1b244f195163/manager/0.log" Feb 28 09:52:51 crc kubenswrapper[4687]: I0228 09:52:51.339650 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7b6bfb6475-jbzlm_a2ca8c5d-3391-4ae4-a451-8a14fe2352aa/manager/0.log" Feb 28 09:52:51 crc kubenswrapper[4687]: I0228 09:52:51.454142 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54688575f-hsvs9_09ff8e79-084a-4043-9061-c7007b041e86/manager/0.log" Feb 28 09:52:51 crc kubenswrapper[4687]: I0228 09:52:51.745489 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-74b6b5dc96-kdxq5_72be3389-d521-4742-9081-8bdc3aef0dc6/manager/0.log" Feb 28 09:52:51 crc kubenswrapper[4687]: I0228 09:52:51.848173 4687 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5d86c7ddb7-dsfvj_134bd541-e4b0-4e84-b85d-a50c413d6cd2/manager/0.log" Feb 28 09:52:51 crc kubenswrapper[4687]: I0228 09:52:51.934753 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7b4cc4776925xf7_e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe/manager/0.log" Feb 28 09:52:52 crc kubenswrapper[4687]: I0228 09:52:52.250874 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-595c94944c-4zqnh_fff03855-1690-4745-825d-919a9f9469ea/operator/0.log" Feb 28 09:52:52 crc kubenswrapper[4687]: I0228 09:52:52.456131 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-qtbgc_c15f16ef-addd-4cba-b2c3-69b4691fa2c7/registry-server/0.log" Feb 28 09:52:52 crc kubenswrapper[4687]: I0228 09:52:52.647665 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-75684d597f-9fpjj_41e8cac0-417a-4c1d-a31c-0389bdebd0ba/manager/0.log" Feb 28 09:52:52 crc kubenswrapper[4687]: I0228 09:52:52.717210 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-648564c9fc-jht6f_7f019778-ba45-4e4a-a6d8-dd6d056aed3b/manager/0.log" Feb 28 09:52:52 crc kubenswrapper[4687]: I0228 09:52:52.872916 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-p64nn_da7dfebc-ad65-4d02-a7f8-c10f9a6ac0d4/operator/0.log" Feb 28 09:52:53 crc kubenswrapper[4687]: I0228 09:52:53.094318 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9b9ff9f4d-q5zdg_9f7d6d86-afe8-4c99-8e5e-d81279cf5a9a/manager/0.log" Feb 28 09:52:53 crc kubenswrapper[4687]: I0228 09:52:53.295239 4687 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5fdb694969-fxqv8_5ab4ce15-ddc0-4f3b-bdb0-29ce65884eaf/manager/0.log" Feb 28 09:52:53 crc kubenswrapper[4687]: I0228 09:52:53.298427 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55b5ff4dbb-2t7hs_ccb38bca-46b2-4c3c-a6c5-d30af68435d1/manager/0.log" Feb 28 09:52:53 crc kubenswrapper[4687]: I0228 09:52:53.456759 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-c92d5_3ebd35dc-7a29-4c3f-b442-bfe29d833f06/manager/0.log" Feb 28 09:52:53 crc kubenswrapper[4687]: I0228 09:52:53.867711 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-864b865b94-72kg5_005ef854-8015-4724-b7b1-42f8fe9a1497/manager/0.log" Feb 28 09:52:55 crc kubenswrapper[4687]: I0228 09:52:55.733109 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6db6876945-jtdtt_dc30956e-12c6-4973-a99f-ae4b502abb17/manager/0.log" Feb 28 09:52:56 crc kubenswrapper[4687]: I0228 09:52:56.665149 4687 scope.go:117] "RemoveContainer" containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" Feb 28 09:52:56 crc kubenswrapper[4687]: E0228 09:52:56.665875 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:53:09 crc kubenswrapper[4687]: I0228 09:53:09.657479 4687 scope.go:117] "RemoveContainer" 
containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" Feb 28 09:53:09 crc kubenswrapper[4687]: E0228 09:53:09.658571 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:53:11 crc kubenswrapper[4687]: I0228 09:53:11.228087 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-494jw_7461d892-4781-495c-b78f-5fe375ed4f44/control-plane-machine-set-operator/0.log" Feb 28 09:53:11 crc kubenswrapper[4687]: I0228 09:53:11.386551 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9thbt_9292d86c-b9c1-4a63-a766-c25874ffa2f5/kube-rbac-proxy/0.log" Feb 28 09:53:11 crc kubenswrapper[4687]: I0228 09:53:11.436421 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9thbt_9292d86c-b9c1-4a63-a766-c25874ffa2f5/machine-api-operator/0.log" Feb 28 09:53:21 crc kubenswrapper[4687]: I0228 09:53:21.657581 4687 scope.go:117] "RemoveContainer" containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" Feb 28 09:53:21 crc kubenswrapper[4687]: E0228 09:53:21.658299 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" 
podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:53:22 crc kubenswrapper[4687]: I0228 09:53:22.766791 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-jrbfz_5b4222a9-1f7a-48de-879a-4c5dc9d4d99d/cert-manager-controller/0.log" Feb 28 09:53:22 crc kubenswrapper[4687]: I0228 09:53:22.990260 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-h6sww_42c2a835-9620-4ed3-8dc5-dbe24b201af7/cert-manager-webhook/0.log" Feb 28 09:53:22 crc kubenswrapper[4687]: I0228 09:53:22.991038 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-gg5lw_45a33d4f-01db-48af-aa18-b0a18834a9ab/cert-manager-cainjector/0.log" Feb 28 09:53:34 crc kubenswrapper[4687]: I0228 09:53:34.764781 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-pq26r_fe724a47-e6db-4940-885f-318abb45fb46/nmstate-console-plugin/0.log" Feb 28 09:53:34 crc kubenswrapper[4687]: I0228 09:53:34.999206 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-6lptp_02027bc3-0840-49ed-afe6-13d5285bdff9/nmstate-handler/0.log" Feb 28 09:53:35 crc kubenswrapper[4687]: I0228 09:53:35.045437 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-c66wz_88c47658-dd20-4f97-b063-b95f5bd2d79d/kube-rbac-proxy/0.log" Feb 28 09:53:35 crc kubenswrapper[4687]: I0228 09:53:35.172737 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-5sb42_8da54ae4-877e-4e38-890c-8eabef7c7033/nmstate-operator/0.log" Feb 28 09:53:35 crc kubenswrapper[4687]: I0228 09:53:35.179309 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-c66wz_88c47658-dd20-4f97-b063-b95f5bd2d79d/nmstate-metrics/0.log" Feb 28 09:53:35 crc 
kubenswrapper[4687]: I0228 09:53:35.359743 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-8kg5p_5dc19058-cbce-4742-9c1f-11005a9aefbf/nmstate-webhook/0.log" Feb 28 09:53:35 crc kubenswrapper[4687]: I0228 09:53:35.657193 4687 scope.go:117] "RemoveContainer" containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" Feb 28 09:53:35 crc kubenswrapper[4687]: E0228 09:53:35.657889 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:53:49 crc kubenswrapper[4687]: I0228 09:53:49.657220 4687 scope.go:117] "RemoveContainer" containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" Feb 28 09:53:49 crc kubenswrapper[4687]: E0228 09:53:49.659523 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:54:00 crc kubenswrapper[4687]: I0228 09:54:00.144072 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537874-8rbwr"] Feb 28 09:54:00 crc kubenswrapper[4687]: E0228 09:54:00.145863 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0da9027a-c15d-475c-b4ea-196e7b9889f5" containerName="container-00" Feb 28 09:54:00 crc kubenswrapper[4687]: I0228 09:54:00.145936 
4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="0da9027a-c15d-475c-b4ea-196e7b9889f5" containerName="container-00" Feb 28 09:54:00 crc kubenswrapper[4687]: E0228 09:54:00.145996 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12154ab8-bd23-418d-a6d3-a1b4c8d51fad" containerName="oc" Feb 28 09:54:00 crc kubenswrapper[4687]: I0228 09:54:00.146068 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="12154ab8-bd23-418d-a6d3-a1b4c8d51fad" containerName="oc" Feb 28 09:54:00 crc kubenswrapper[4687]: I0228 09:54:00.146269 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="0da9027a-c15d-475c-b4ea-196e7b9889f5" containerName="container-00" Feb 28 09:54:00 crc kubenswrapper[4687]: I0228 09:54:00.146332 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="12154ab8-bd23-418d-a6d3-a1b4c8d51fad" containerName="oc" Feb 28 09:54:00 crc kubenswrapper[4687]: I0228 09:54:00.146992 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537874-8rbwr" Feb 28 09:54:00 crc kubenswrapper[4687]: I0228 09:54:00.149530 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:54:00 crc kubenswrapper[4687]: I0228 09:54:00.149539 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:54:00 crc kubenswrapper[4687]: I0228 09:54:00.149682 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:54:00 crc kubenswrapper[4687]: I0228 09:54:00.154873 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537874-8rbwr"] Feb 28 09:54:00 crc kubenswrapper[4687]: I0228 09:54:00.159445 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b7t5\" (UniqueName: 
\"kubernetes.io/projected/fe98ecf8-2fd8-4f24-a3b7-6fc0e691a26b-kube-api-access-6b7t5\") pod \"auto-csr-approver-29537874-8rbwr\" (UID: \"fe98ecf8-2fd8-4f24-a3b7-6fc0e691a26b\") " pod="openshift-infra/auto-csr-approver-29537874-8rbwr" Feb 28 09:54:00 crc kubenswrapper[4687]: I0228 09:54:00.249478 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-tqhsm_87d609a5-fd9a-4473-80e4-b94dc583b438/kube-rbac-proxy/0.log" Feb 28 09:54:00 crc kubenswrapper[4687]: I0228 09:54:00.261407 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b7t5\" (UniqueName: \"kubernetes.io/projected/fe98ecf8-2fd8-4f24-a3b7-6fc0e691a26b-kube-api-access-6b7t5\") pod \"auto-csr-approver-29537874-8rbwr\" (UID: \"fe98ecf8-2fd8-4f24-a3b7-6fc0e691a26b\") " pod="openshift-infra/auto-csr-approver-29537874-8rbwr" Feb 28 09:54:00 crc kubenswrapper[4687]: I0228 09:54:00.287924 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b7t5\" (UniqueName: \"kubernetes.io/projected/fe98ecf8-2fd8-4f24-a3b7-6fc0e691a26b-kube-api-access-6b7t5\") pod \"auto-csr-approver-29537874-8rbwr\" (UID: \"fe98ecf8-2fd8-4f24-a3b7-6fc0e691a26b\") " pod="openshift-infra/auto-csr-approver-29537874-8rbwr" Feb 28 09:54:00 crc kubenswrapper[4687]: I0228 09:54:00.324359 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-tqhsm_87d609a5-fd9a-4473-80e4-b94dc583b438/controller/0.log" Feb 28 09:54:00 crc kubenswrapper[4687]: I0228 09:54:00.425000 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/cp-frr-files/0.log" Feb 28 09:54:00 crc kubenswrapper[4687]: I0228 09:54:00.471628 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537874-8rbwr" Feb 28 09:54:00 crc kubenswrapper[4687]: I0228 09:54:00.652450 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/cp-frr-files/0.log" Feb 28 09:54:00 crc kubenswrapper[4687]: I0228 09:54:00.673405 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/cp-reloader/0.log" Feb 28 09:54:00 crc kubenswrapper[4687]: I0228 09:54:00.726376 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/cp-metrics/0.log" Feb 28 09:54:00 crc kubenswrapper[4687]: I0228 09:54:00.741536 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/cp-reloader/0.log" Feb 28 09:54:00 crc kubenswrapper[4687]: I0228 09:54:00.869458 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/cp-frr-files/0.log" Feb 28 09:54:00 crc kubenswrapper[4687]: I0228 09:54:00.880396 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/cp-metrics/0.log" Feb 28 09:54:00 crc kubenswrapper[4687]: I0228 09:54:00.903107 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/cp-metrics/0.log" Feb 28 09:54:00 crc kubenswrapper[4687]: I0228 09:54:00.914579 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/cp-reloader/0.log" Feb 28 09:54:00 crc kubenswrapper[4687]: I0228 09:54:00.939698 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537874-8rbwr"] Feb 28 09:54:01 crc kubenswrapper[4687]: I0228 
09:54:01.116302 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/cp-metrics/0.log" Feb 28 09:54:01 crc kubenswrapper[4687]: I0228 09:54:01.125038 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/cp-reloader/0.log" Feb 28 09:54:01 crc kubenswrapper[4687]: I0228 09:54:01.126997 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/controller/0.log" Feb 28 09:54:01 crc kubenswrapper[4687]: I0228 09:54:01.319911 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/cp-frr-files/0.log" Feb 28 09:54:01 crc kubenswrapper[4687]: I0228 09:54:01.350723 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537874-8rbwr" event={"ID":"fe98ecf8-2fd8-4f24-a3b7-6fc0e691a26b","Type":"ContainerStarted","Data":"cf1354b6e62b25b9a90dcbe3b0384e5a6c6623968eec81488030bd7ac1ab66b3"} Feb 28 09:54:01 crc kubenswrapper[4687]: I0228 09:54:01.450751 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/kube-rbac-proxy/0.log" Feb 28 09:54:01 crc kubenswrapper[4687]: I0228 09:54:01.478253 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/frr-metrics/0.log" Feb 28 09:54:01 crc kubenswrapper[4687]: I0228 09:54:01.485779 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/kube-rbac-proxy-frr/0.log" Feb 28 09:54:01 crc kubenswrapper[4687]: I0228 09:54:01.662982 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/reloader/0.log" Feb 28 09:54:01 
crc kubenswrapper[4687]: I0228 09:54:01.675682 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-qxhmg_5df08eed-eb11-482c-95aa-daebcccec8a8/frr-k8s-webhook-server/0.log" Feb 28 09:54:01 crc kubenswrapper[4687]: I0228 09:54:01.829136 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6f7cb57fd8-p9bs4_370a0b00-a4b2-428b-887b-5e0a7dce8d53/manager/0.log" Feb 28 09:54:02 crc kubenswrapper[4687]: I0228 09:54:02.024381 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-686bcc794c-5fsqb_287a3bc3-7f28-47be-90ab-6b25ea27db38/webhook-server/0.log" Feb 28 09:54:02 crc kubenswrapper[4687]: I0228 09:54:02.180559 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bnlzc_bdaf3bdc-4287-4a7c-9156-613b50d6afcc/kube-rbac-proxy/0.log" Feb 28 09:54:02 crc kubenswrapper[4687]: I0228 09:54:02.360162 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537874-8rbwr" event={"ID":"fe98ecf8-2fd8-4f24-a3b7-6fc0e691a26b","Type":"ContainerStarted","Data":"8cb5c0aa8f2a9e0a1635a3d0c94db6e328ee6289e3a40686ce496591f7e3e79e"} Feb 28 09:54:02 crc kubenswrapper[4687]: I0228 09:54:02.376670 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29537874-8rbwr" podStartSLOduration=1.5050519759999998 podStartE2EDuration="2.376654329s" podCreationTimestamp="2026-02-28 09:54:00 +0000 UTC" firstStartedPulling="2026-02-28 09:54:00.959819028 +0000 UTC m=+3032.650388366" lastFinishedPulling="2026-02-28 09:54:01.831421392 +0000 UTC m=+3033.521990719" observedRunningTime="2026-02-28 09:54:02.374332995 +0000 UTC m=+3034.064902323" watchObservedRunningTime="2026-02-28 09:54:02.376654329 +0000 UTC m=+3034.067223666" Feb 28 09:54:02 crc kubenswrapper[4687]: I0228 09:54:02.742080 4687 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bnlzc_bdaf3bdc-4287-4a7c-9156-613b50d6afcc/speaker/0.log" Feb 28 09:54:02 crc kubenswrapper[4687]: I0228 09:54:02.842937 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/frr/0.log" Feb 28 09:54:03 crc kubenswrapper[4687]: I0228 09:54:03.369012 4687 generic.go:334] "Generic (PLEG): container finished" podID="fe98ecf8-2fd8-4f24-a3b7-6fc0e691a26b" containerID="8cb5c0aa8f2a9e0a1635a3d0c94db6e328ee6289e3a40686ce496591f7e3e79e" exitCode=0 Feb 28 09:54:03 crc kubenswrapper[4687]: I0228 09:54:03.369073 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537874-8rbwr" event={"ID":"fe98ecf8-2fd8-4f24-a3b7-6fc0e691a26b","Type":"ContainerDied","Data":"8cb5c0aa8f2a9e0a1635a3d0c94db6e328ee6289e3a40686ce496591f7e3e79e"} Feb 28 09:54:03 crc kubenswrapper[4687]: I0228 09:54:03.656600 4687 scope.go:117] "RemoveContainer" containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" Feb 28 09:54:03 crc kubenswrapper[4687]: E0228 09:54:03.656909 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:54:04 crc kubenswrapper[4687]: I0228 09:54:04.649450 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537874-8rbwr" Feb 28 09:54:04 crc kubenswrapper[4687]: I0228 09:54:04.668608 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b7t5\" (UniqueName: \"kubernetes.io/projected/fe98ecf8-2fd8-4f24-a3b7-6fc0e691a26b-kube-api-access-6b7t5\") pod \"fe98ecf8-2fd8-4f24-a3b7-6fc0e691a26b\" (UID: \"fe98ecf8-2fd8-4f24-a3b7-6fc0e691a26b\") " Feb 28 09:54:04 crc kubenswrapper[4687]: I0228 09:54:04.677719 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe98ecf8-2fd8-4f24-a3b7-6fc0e691a26b-kube-api-access-6b7t5" (OuterVolumeSpecName: "kube-api-access-6b7t5") pod "fe98ecf8-2fd8-4f24-a3b7-6fc0e691a26b" (UID: "fe98ecf8-2fd8-4f24-a3b7-6fc0e691a26b"). InnerVolumeSpecName "kube-api-access-6b7t5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:54:04 crc kubenswrapper[4687]: I0228 09:54:04.771411 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b7t5\" (UniqueName: \"kubernetes.io/projected/fe98ecf8-2fd8-4f24-a3b7-6fc0e691a26b-kube-api-access-6b7t5\") on node \"crc\" DevicePath \"\"" Feb 28 09:54:05 crc kubenswrapper[4687]: I0228 09:54:05.390446 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537874-8rbwr" event={"ID":"fe98ecf8-2fd8-4f24-a3b7-6fc0e691a26b","Type":"ContainerDied","Data":"cf1354b6e62b25b9a90dcbe3b0384e5a6c6623968eec81488030bd7ac1ab66b3"} Feb 28 09:54:05 crc kubenswrapper[4687]: I0228 09:54:05.390522 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf1354b6e62b25b9a90dcbe3b0384e5a6c6623968eec81488030bd7ac1ab66b3" Feb 28 09:54:05 crc kubenswrapper[4687]: I0228 09:54:05.390527 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537874-8rbwr" Feb 28 09:54:05 crc kubenswrapper[4687]: I0228 09:54:05.473357 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537868-5wdhp"] Feb 28 09:54:05 crc kubenswrapper[4687]: I0228 09:54:05.483133 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537868-5wdhp"] Feb 28 09:54:06 crc kubenswrapper[4687]: I0228 09:54:06.671011 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="421812d0-9afe-48ff-a4e1-6909ebb201d0" path="/var/lib/kubelet/pods/421812d0-9afe-48ff-a4e1-6909ebb201d0/volumes" Feb 28 09:54:14 crc kubenswrapper[4687]: I0228 09:54:14.576633 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd_d5bd06a9-5b96-437f-a148-91f7d90e1f00/util/0.log" Feb 28 09:54:14 crc kubenswrapper[4687]: I0228 09:54:14.784633 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd_d5bd06a9-5b96-437f-a148-91f7d90e1f00/util/0.log" Feb 28 09:54:14 crc kubenswrapper[4687]: I0228 09:54:14.786094 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd_d5bd06a9-5b96-437f-a148-91f7d90e1f00/pull/0.log" Feb 28 09:54:14 crc kubenswrapper[4687]: I0228 09:54:14.795933 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd_d5bd06a9-5b96-437f-a148-91f7d90e1f00/pull/0.log" Feb 28 09:54:14 crc kubenswrapper[4687]: I0228 09:54:14.971805 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd_d5bd06a9-5b96-437f-a148-91f7d90e1f00/pull/0.log" Feb 28 
09:54:14 crc kubenswrapper[4687]: I0228 09:54:14.995617 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd_d5bd06a9-5b96-437f-a148-91f7d90e1f00/util/0.log" Feb 28 09:54:15 crc kubenswrapper[4687]: I0228 09:54:15.007385 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd_d5bd06a9-5b96-437f-a148-91f7d90e1f00/extract/0.log" Feb 28 09:54:15 crc kubenswrapper[4687]: I0228 09:54:15.165050 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2ljbw_5d6d3c50-a212-411b-9c51-4ea3b3fee060/extract-utilities/0.log" Feb 28 09:54:15 crc kubenswrapper[4687]: I0228 09:54:15.496679 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2ljbw_5d6d3c50-a212-411b-9c51-4ea3b3fee060/extract-utilities/0.log" Feb 28 09:54:15 crc kubenswrapper[4687]: I0228 09:54:15.506716 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2ljbw_5d6d3c50-a212-411b-9c51-4ea3b3fee060/extract-content/0.log" Feb 28 09:54:15 crc kubenswrapper[4687]: I0228 09:54:15.542405 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2ljbw_5d6d3c50-a212-411b-9c51-4ea3b3fee060/extract-content/0.log" Feb 28 09:54:15 crc kubenswrapper[4687]: I0228 09:54:15.615419 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2ljbw_5d6d3c50-a212-411b-9c51-4ea3b3fee060/extract-utilities/0.log" Feb 28 09:54:15 crc kubenswrapper[4687]: I0228 09:54:15.655792 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2ljbw_5d6d3c50-a212-411b-9c51-4ea3b3fee060/extract-content/0.log" Feb 28 09:54:15 crc kubenswrapper[4687]: I0228 09:54:15.656440 
4687 scope.go:117] "RemoveContainer" containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" Feb 28 09:54:15 crc kubenswrapper[4687]: E0228 09:54:15.656709 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:54:15 crc kubenswrapper[4687]: I0228 09:54:15.876549 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99h7q_43862b0c-fb60-45f2-b4bd-0e09864292a9/extract-utilities/0.log" Feb 28 09:54:16 crc kubenswrapper[4687]: I0228 09:54:16.029773 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99h7q_43862b0c-fb60-45f2-b4bd-0e09864292a9/extract-content/0.log" Feb 28 09:54:16 crc kubenswrapper[4687]: I0228 09:54:16.076785 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99h7q_43862b0c-fb60-45f2-b4bd-0e09864292a9/extract-utilities/0.log" Feb 28 09:54:16 crc kubenswrapper[4687]: I0228 09:54:16.114457 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99h7q_43862b0c-fb60-45f2-b4bd-0e09864292a9/extract-content/0.log" Feb 28 09:54:16 crc kubenswrapper[4687]: I0228 09:54:16.146188 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2ljbw_5d6d3c50-a212-411b-9c51-4ea3b3fee060/registry-server/0.log" Feb 28 09:54:16 crc kubenswrapper[4687]: I0228 09:54:16.271221 4687 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-99h7q_43862b0c-fb60-45f2-b4bd-0e09864292a9/extract-utilities/0.log" Feb 28 09:54:16 crc kubenswrapper[4687]: I0228 09:54:16.294277 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99h7q_43862b0c-fb60-45f2-b4bd-0e09864292a9/extract-content/0.log" Feb 28 09:54:16 crc kubenswrapper[4687]: I0228 09:54:16.475571 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7_993f721e-f5f5-4e7e-9896-5931bd6e0023/util/0.log" Feb 28 09:54:16 crc kubenswrapper[4687]: I0228 09:54:16.666751 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7_993f721e-f5f5-4e7e-9896-5931bd6e0023/util/0.log" Feb 28 09:54:16 crc kubenswrapper[4687]: I0228 09:54:16.691939 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7_993f721e-f5f5-4e7e-9896-5931bd6e0023/pull/0.log" Feb 28 09:54:16 crc kubenswrapper[4687]: I0228 09:54:16.736875 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7_993f721e-f5f5-4e7e-9896-5931bd6e0023/pull/0.log" Feb 28 09:54:16 crc kubenswrapper[4687]: I0228 09:54:16.862945 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99h7q_43862b0c-fb60-45f2-b4bd-0e09864292a9/registry-server/0.log" Feb 28 09:54:16 crc kubenswrapper[4687]: I0228 09:54:16.913171 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7_993f721e-f5f5-4e7e-9896-5931bd6e0023/extract/0.log" Feb 28 09:54:16 crc kubenswrapper[4687]: I0228 09:54:16.913577 4687 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7_993f721e-f5f5-4e7e-9896-5931bd6e0023/util/0.log" Feb 28 09:54:16 crc kubenswrapper[4687]: I0228 09:54:16.930962 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7_993f721e-f5f5-4e7e-9896-5931bd6e0023/pull/0.log" Feb 28 09:54:17 crc kubenswrapper[4687]: I0228 09:54:17.102221 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qhc57_e9586004-7da3-41d4-980d-825eafe37f51/marketplace-operator/0.log" Feb 28 09:54:17 crc kubenswrapper[4687]: I0228 09:54:17.142408 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dcllh_1c55d393-9095-4638-b5d0-d6dd60859eb8/extract-utilities/0.log" Feb 28 09:54:17 crc kubenswrapper[4687]: I0228 09:54:17.295301 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dcllh_1c55d393-9095-4638-b5d0-d6dd60859eb8/extract-utilities/0.log" Feb 28 09:54:17 crc kubenswrapper[4687]: I0228 09:54:17.320641 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dcllh_1c55d393-9095-4638-b5d0-d6dd60859eb8/extract-content/0.log" Feb 28 09:54:17 crc kubenswrapper[4687]: I0228 09:54:17.320709 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dcllh_1c55d393-9095-4638-b5d0-d6dd60859eb8/extract-content/0.log" Feb 28 09:54:17 crc kubenswrapper[4687]: I0228 09:54:17.507695 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dcllh_1c55d393-9095-4638-b5d0-d6dd60859eb8/extract-utilities/0.log" Feb 28 09:54:17 crc kubenswrapper[4687]: I0228 09:54:17.511505 4687 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-dcllh_1c55d393-9095-4638-b5d0-d6dd60859eb8/extract-content/0.log" Feb 28 09:54:17 crc kubenswrapper[4687]: I0228 09:54:17.643901 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dcllh_1c55d393-9095-4638-b5d0-d6dd60859eb8/registry-server/0.log" Feb 28 09:54:17 crc kubenswrapper[4687]: I0228 09:54:17.705170 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xssb6_12de48e8-809e-43e9-827f-28ce52d796e8/extract-utilities/0.log" Feb 28 09:54:17 crc kubenswrapper[4687]: I0228 09:54:17.849413 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xssb6_12de48e8-809e-43e9-827f-28ce52d796e8/extract-content/0.log" Feb 28 09:54:17 crc kubenswrapper[4687]: I0228 09:54:17.883812 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xssb6_12de48e8-809e-43e9-827f-28ce52d796e8/extract-content/0.log" Feb 28 09:54:17 crc kubenswrapper[4687]: I0228 09:54:17.884033 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xssb6_12de48e8-809e-43e9-827f-28ce52d796e8/extract-utilities/0.log" Feb 28 09:54:18 crc kubenswrapper[4687]: I0228 09:54:18.036305 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xssb6_12de48e8-809e-43e9-827f-28ce52d796e8/extract-utilities/0.log" Feb 28 09:54:18 crc kubenswrapper[4687]: I0228 09:54:18.049156 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xssb6_12de48e8-809e-43e9-827f-28ce52d796e8/extract-content/0.log" Feb 28 09:54:18 crc kubenswrapper[4687]: I0228 09:54:18.474293 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xssb6_12de48e8-809e-43e9-827f-28ce52d796e8/registry-server/0.log" Feb 28 
09:54:30 crc kubenswrapper[4687]: I0228 09:54:30.658397 4687 scope.go:117] "RemoveContainer" containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" Feb 28 09:54:30 crc kubenswrapper[4687]: E0228 09:54:30.659492 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:54:43 crc kubenswrapper[4687]: I0228 09:54:43.658971 4687 scope.go:117] "RemoveContainer" containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" Feb 28 09:54:43 crc kubenswrapper[4687]: E0228 09:54:43.659782 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:54:44 crc kubenswrapper[4687]: I0228 09:54:44.055846 4687 scope.go:117] "RemoveContainer" containerID="f12a1dd6db244429dff6440cf79ed1eb28ad978284b82f66c13b53398d5692f7" Feb 28 09:54:55 crc kubenswrapper[4687]: I0228 09:54:55.657567 4687 scope.go:117] "RemoveContainer" containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" Feb 28 09:54:55 crc kubenswrapper[4687]: E0228 09:54:55.658346 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:55:10 crc kubenswrapper[4687]: I0228 09:55:10.658186 4687 scope.go:117] "RemoveContainer" containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" Feb 28 09:55:10 crc kubenswrapper[4687]: E0228 09:55:10.659065 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:55:24 crc kubenswrapper[4687]: I0228 09:55:24.657917 4687 scope.go:117] "RemoveContainer" containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" Feb 28 09:55:24 crc kubenswrapper[4687]: E0228 09:55:24.658772 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:55:38 crc kubenswrapper[4687]: I0228 09:55:38.663784 4687 scope.go:117] "RemoveContainer" containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" Feb 28 09:55:38 crc kubenswrapper[4687]: E0228 09:55:38.666163 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:55:45 crc kubenswrapper[4687]: I0228 09:55:45.313666 4687 generic.go:334] "Generic (PLEG): container finished" podID="ab705756-374f-437c-bf57-49e79e72cdc1" containerID="5518b3187d188f445ed4e4737a57abff296ee48c264f73738613fc39c0da7220" exitCode=0 Feb 28 09:55:45 crc kubenswrapper[4687]: I0228 09:55:45.313787 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8jjxt/must-gather-rjvgr" event={"ID":"ab705756-374f-437c-bf57-49e79e72cdc1","Type":"ContainerDied","Data":"5518b3187d188f445ed4e4737a57abff296ee48c264f73738613fc39c0da7220"} Feb 28 09:55:45 crc kubenswrapper[4687]: I0228 09:55:45.314740 4687 scope.go:117] "RemoveContainer" containerID="5518b3187d188f445ed4e4737a57abff296ee48c264f73738613fc39c0da7220" Feb 28 09:55:45 crc kubenswrapper[4687]: I0228 09:55:45.577224 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8jjxt_must-gather-rjvgr_ab705756-374f-437c-bf57-49e79e72cdc1/gather/0.log" Feb 28 09:55:49 crc kubenswrapper[4687]: I0228 09:55:49.657689 4687 scope.go:117] "RemoveContainer" containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" Feb 28 09:55:49 crc kubenswrapper[4687]: E0228 09:55:49.658763 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:55:52 crc kubenswrapper[4687]: I0228 09:55:52.864054 4687 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-must-gather-8jjxt/must-gather-rjvgr"] Feb 28 09:55:52 crc kubenswrapper[4687]: I0228 09:55:52.864355 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-8jjxt/must-gather-rjvgr" podUID="ab705756-374f-437c-bf57-49e79e72cdc1" containerName="copy" containerID="cri-o://6c934c25915a93f85df61097b65e82d2720baf23725857eb0b7040117f7b8235" gracePeriod=2 Feb 28 09:55:52 crc kubenswrapper[4687]: I0228 09:55:52.877448 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8jjxt/must-gather-rjvgr"] Feb 28 09:55:53 crc kubenswrapper[4687]: I0228 09:55:53.235340 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8jjxt_must-gather-rjvgr_ab705756-374f-437c-bf57-49e79e72cdc1/copy/0.log" Feb 28 09:55:53 crc kubenswrapper[4687]: I0228 09:55:53.236291 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8jjxt/must-gather-rjvgr" Feb 28 09:55:53 crc kubenswrapper[4687]: I0228 09:55:53.348455 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh6g8\" (UniqueName: \"kubernetes.io/projected/ab705756-374f-437c-bf57-49e79e72cdc1-kube-api-access-nh6g8\") pod \"ab705756-374f-437c-bf57-49e79e72cdc1\" (UID: \"ab705756-374f-437c-bf57-49e79e72cdc1\") " Feb 28 09:55:53 crc kubenswrapper[4687]: I0228 09:55:53.348554 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ab705756-374f-437c-bf57-49e79e72cdc1-must-gather-output\") pod \"ab705756-374f-437c-bf57-49e79e72cdc1\" (UID: \"ab705756-374f-437c-bf57-49e79e72cdc1\") " Feb 28 09:55:53 crc kubenswrapper[4687]: I0228 09:55:53.356537 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab705756-374f-437c-bf57-49e79e72cdc1-kube-api-access-nh6g8" 
(OuterVolumeSpecName: "kube-api-access-nh6g8") pod "ab705756-374f-437c-bf57-49e79e72cdc1" (UID: "ab705756-374f-437c-bf57-49e79e72cdc1"). InnerVolumeSpecName "kube-api-access-nh6g8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:55:53 crc kubenswrapper[4687]: I0228 09:55:53.395460 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8jjxt_must-gather-rjvgr_ab705756-374f-437c-bf57-49e79e72cdc1/copy/0.log" Feb 28 09:55:53 crc kubenswrapper[4687]: I0228 09:55:53.395951 4687 generic.go:334] "Generic (PLEG): container finished" podID="ab705756-374f-437c-bf57-49e79e72cdc1" containerID="6c934c25915a93f85df61097b65e82d2720baf23725857eb0b7040117f7b8235" exitCode=143 Feb 28 09:55:53 crc kubenswrapper[4687]: I0228 09:55:53.396044 4687 scope.go:117] "RemoveContainer" containerID="6c934c25915a93f85df61097b65e82d2720baf23725857eb0b7040117f7b8235" Feb 28 09:55:53 crc kubenswrapper[4687]: I0228 09:55:53.396093 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8jjxt/must-gather-rjvgr" Feb 28 09:55:53 crc kubenswrapper[4687]: I0228 09:55:53.419827 4687 scope.go:117] "RemoveContainer" containerID="5518b3187d188f445ed4e4737a57abff296ee48c264f73738613fc39c0da7220" Feb 28 09:55:53 crc kubenswrapper[4687]: I0228 09:55:53.452678 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh6g8\" (UniqueName: \"kubernetes.io/projected/ab705756-374f-437c-bf57-49e79e72cdc1-kube-api-access-nh6g8\") on node \"crc\" DevicePath \"\"" Feb 28 09:55:53 crc kubenswrapper[4687]: I0228 09:55:53.473553 4687 scope.go:117] "RemoveContainer" containerID="6c934c25915a93f85df61097b65e82d2720baf23725857eb0b7040117f7b8235" Feb 28 09:55:53 crc kubenswrapper[4687]: E0228 09:55:53.474171 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c934c25915a93f85df61097b65e82d2720baf23725857eb0b7040117f7b8235\": container with ID starting with 6c934c25915a93f85df61097b65e82d2720baf23725857eb0b7040117f7b8235 not found: ID does not exist" containerID="6c934c25915a93f85df61097b65e82d2720baf23725857eb0b7040117f7b8235" Feb 28 09:55:53 crc kubenswrapper[4687]: I0228 09:55:53.474222 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c934c25915a93f85df61097b65e82d2720baf23725857eb0b7040117f7b8235"} err="failed to get container status \"6c934c25915a93f85df61097b65e82d2720baf23725857eb0b7040117f7b8235\": rpc error: code = NotFound desc = could not find container \"6c934c25915a93f85df61097b65e82d2720baf23725857eb0b7040117f7b8235\": container with ID starting with 6c934c25915a93f85df61097b65e82d2720baf23725857eb0b7040117f7b8235 not found: ID does not exist" Feb 28 09:55:53 crc kubenswrapper[4687]: I0228 09:55:53.474268 4687 scope.go:117] "RemoveContainer" containerID="5518b3187d188f445ed4e4737a57abff296ee48c264f73738613fc39c0da7220" Feb 28 09:55:53 crc kubenswrapper[4687]: 
E0228 09:55:53.474825 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5518b3187d188f445ed4e4737a57abff296ee48c264f73738613fc39c0da7220\": container with ID starting with 5518b3187d188f445ed4e4737a57abff296ee48c264f73738613fc39c0da7220 not found: ID does not exist" containerID="5518b3187d188f445ed4e4737a57abff296ee48c264f73738613fc39c0da7220" Feb 28 09:55:53 crc kubenswrapper[4687]: I0228 09:55:53.474870 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5518b3187d188f445ed4e4737a57abff296ee48c264f73738613fc39c0da7220"} err="failed to get container status \"5518b3187d188f445ed4e4737a57abff296ee48c264f73738613fc39c0da7220\": rpc error: code = NotFound desc = could not find container \"5518b3187d188f445ed4e4737a57abff296ee48c264f73738613fc39c0da7220\": container with ID starting with 5518b3187d188f445ed4e4737a57abff296ee48c264f73738613fc39c0da7220 not found: ID does not exist" Feb 28 09:55:53 crc kubenswrapper[4687]: I0228 09:55:53.504684 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab705756-374f-437c-bf57-49e79e72cdc1-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ab705756-374f-437c-bf57-49e79e72cdc1" (UID: "ab705756-374f-437c-bf57-49e79e72cdc1"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:55:53 crc kubenswrapper[4687]: I0228 09:55:53.555632 4687 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ab705756-374f-437c-bf57-49e79e72cdc1-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 28 09:55:54 crc kubenswrapper[4687]: I0228 09:55:54.667838 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab705756-374f-437c-bf57-49e79e72cdc1" path="/var/lib/kubelet/pods/ab705756-374f-437c-bf57-49e79e72cdc1/volumes" Feb 28 09:56:00 crc kubenswrapper[4687]: I0228 09:56:00.172394 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537876-mpsbc"] Feb 28 09:56:00 crc kubenswrapper[4687]: E0228 09:56:00.173259 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe98ecf8-2fd8-4f24-a3b7-6fc0e691a26b" containerName="oc" Feb 28 09:56:00 crc kubenswrapper[4687]: I0228 09:56:00.173274 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe98ecf8-2fd8-4f24-a3b7-6fc0e691a26b" containerName="oc" Feb 28 09:56:00 crc kubenswrapper[4687]: E0228 09:56:00.173291 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab705756-374f-437c-bf57-49e79e72cdc1" containerName="copy" Feb 28 09:56:00 crc kubenswrapper[4687]: I0228 09:56:00.173297 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab705756-374f-437c-bf57-49e79e72cdc1" containerName="copy" Feb 28 09:56:00 crc kubenswrapper[4687]: E0228 09:56:00.173323 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab705756-374f-437c-bf57-49e79e72cdc1" containerName="gather" Feb 28 09:56:00 crc kubenswrapper[4687]: I0228 09:56:00.173328 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab705756-374f-437c-bf57-49e79e72cdc1" containerName="gather" Feb 28 09:56:00 crc kubenswrapper[4687]: I0228 09:56:00.173487 4687 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="fe98ecf8-2fd8-4f24-a3b7-6fc0e691a26b" containerName="oc" Feb 28 09:56:00 crc kubenswrapper[4687]: I0228 09:56:00.173505 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab705756-374f-437c-bf57-49e79e72cdc1" containerName="gather" Feb 28 09:56:00 crc kubenswrapper[4687]: I0228 09:56:00.173514 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab705756-374f-437c-bf57-49e79e72cdc1" containerName="copy" Feb 28 09:56:00 crc kubenswrapper[4687]: I0228 09:56:00.174146 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537876-mpsbc" Feb 28 09:56:00 crc kubenswrapper[4687]: I0228 09:56:00.176115 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:56:00 crc kubenswrapper[4687]: I0228 09:56:00.176239 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:56:00 crc kubenswrapper[4687]: I0228 09:56:00.176334 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:56:00 crc kubenswrapper[4687]: I0228 09:56:00.187677 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537876-mpsbc"] Feb 28 09:56:00 crc kubenswrapper[4687]: I0228 09:56:00.220324 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8pw9\" (UniqueName: \"kubernetes.io/projected/99a97026-1a84-4969-8518-3e7ac150c55b-kube-api-access-x8pw9\") pod \"auto-csr-approver-29537876-mpsbc\" (UID: \"99a97026-1a84-4969-8518-3e7ac150c55b\") " pod="openshift-infra/auto-csr-approver-29537876-mpsbc" Feb 28 09:56:00 crc kubenswrapper[4687]: I0228 09:56:00.322351 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8pw9\" (UniqueName: 
\"kubernetes.io/projected/99a97026-1a84-4969-8518-3e7ac150c55b-kube-api-access-x8pw9\") pod \"auto-csr-approver-29537876-mpsbc\" (UID: \"99a97026-1a84-4969-8518-3e7ac150c55b\") " pod="openshift-infra/auto-csr-approver-29537876-mpsbc" Feb 28 09:56:00 crc kubenswrapper[4687]: I0228 09:56:00.342915 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8pw9\" (UniqueName: \"kubernetes.io/projected/99a97026-1a84-4969-8518-3e7ac150c55b-kube-api-access-x8pw9\") pod \"auto-csr-approver-29537876-mpsbc\" (UID: \"99a97026-1a84-4969-8518-3e7ac150c55b\") " pod="openshift-infra/auto-csr-approver-29537876-mpsbc" Feb 28 09:56:00 crc kubenswrapper[4687]: I0228 09:56:00.499829 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537876-mpsbc" Feb 28 09:56:00 crc kubenswrapper[4687]: I0228 09:56:00.902417 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537876-mpsbc"] Feb 28 09:56:00 crc kubenswrapper[4687]: I0228 09:56:00.937287 4687 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 28 09:56:01 crc kubenswrapper[4687]: I0228 09:56:01.484778 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537876-mpsbc" event={"ID":"99a97026-1a84-4969-8518-3e7ac150c55b","Type":"ContainerStarted","Data":"71516160382a78b5165c8af43eaa449da060c033454cf746497f30aab598e692"} Feb 28 09:56:02 crc kubenswrapper[4687]: I0228 09:56:02.497330 4687 generic.go:334] "Generic (PLEG): container finished" podID="99a97026-1a84-4969-8518-3e7ac150c55b" containerID="09c21c93a9844a61e643e3d0510ee2dd7a178f7747aa1bcbba90038dbb011c20" exitCode=0 Feb 28 09:56:02 crc kubenswrapper[4687]: I0228 09:56:02.497454 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537876-mpsbc" 
event={"ID":"99a97026-1a84-4969-8518-3e7ac150c55b","Type":"ContainerDied","Data":"09c21c93a9844a61e643e3d0510ee2dd7a178f7747aa1bcbba90038dbb011c20"} Feb 28 09:56:03 crc kubenswrapper[4687]: I0228 09:56:03.657071 4687 scope.go:117] "RemoveContainer" containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" Feb 28 09:56:03 crc kubenswrapper[4687]: E0228 09:56:03.657435 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:56:03 crc kubenswrapper[4687]: I0228 09:56:03.804255 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537876-mpsbc" Feb 28 09:56:04 crc kubenswrapper[4687]: I0228 09:56:04.005262 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8pw9\" (UniqueName: \"kubernetes.io/projected/99a97026-1a84-4969-8518-3e7ac150c55b-kube-api-access-x8pw9\") pod \"99a97026-1a84-4969-8518-3e7ac150c55b\" (UID: \"99a97026-1a84-4969-8518-3e7ac150c55b\") " Feb 28 09:56:04 crc kubenswrapper[4687]: I0228 09:56:04.011095 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99a97026-1a84-4969-8518-3e7ac150c55b-kube-api-access-x8pw9" (OuterVolumeSpecName: "kube-api-access-x8pw9") pod "99a97026-1a84-4969-8518-3e7ac150c55b" (UID: "99a97026-1a84-4969-8518-3e7ac150c55b"). InnerVolumeSpecName "kube-api-access-x8pw9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:56:04 crc kubenswrapper[4687]: I0228 09:56:04.108639 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8pw9\" (UniqueName: \"kubernetes.io/projected/99a97026-1a84-4969-8518-3e7ac150c55b-kube-api-access-x8pw9\") on node \"crc\" DevicePath \"\"" Feb 28 09:56:04 crc kubenswrapper[4687]: I0228 09:56:04.519734 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537876-mpsbc" event={"ID":"99a97026-1a84-4969-8518-3e7ac150c55b","Type":"ContainerDied","Data":"71516160382a78b5165c8af43eaa449da060c033454cf746497f30aab598e692"} Feb 28 09:56:04 crc kubenswrapper[4687]: I0228 09:56:04.519782 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71516160382a78b5165c8af43eaa449da060c033454cf746497f30aab598e692" Feb 28 09:56:04 crc kubenswrapper[4687]: I0228 09:56:04.519814 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537876-mpsbc" Feb 28 09:56:04 crc kubenswrapper[4687]: I0228 09:56:04.879469 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537870-bbqwn"] Feb 28 09:56:04 crc kubenswrapper[4687]: I0228 09:56:04.887128 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537870-bbqwn"] Feb 28 09:56:06 crc kubenswrapper[4687]: I0228 09:56:06.669413 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a908c220-334f-497e-a077-5ef0b42d1966" path="/var/lib/kubelet/pods/a908c220-334f-497e-a077-5ef0b42d1966/volumes" Feb 28 09:56:17 crc kubenswrapper[4687]: I0228 09:56:17.657870 4687 scope.go:117] "RemoveContainer" containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" Feb 28 09:56:17 crc kubenswrapper[4687]: E0228 09:56:17.659219 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:56:30 crc kubenswrapper[4687]: I0228 09:56:30.657370 4687 scope.go:117] "RemoveContainer" containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" Feb 28 09:56:30 crc kubenswrapper[4687]: E0228 09:56:30.658357 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:56:39 crc kubenswrapper[4687]: I0228 09:56:39.913426 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6jn5b"] Feb 28 09:56:39 crc kubenswrapper[4687]: E0228 09:56:39.914348 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99a97026-1a84-4969-8518-3e7ac150c55b" containerName="oc" Feb 28 09:56:39 crc kubenswrapper[4687]: I0228 09:56:39.914364 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a97026-1a84-4969-8518-3e7ac150c55b" containerName="oc" Feb 28 09:56:39 crc kubenswrapper[4687]: I0228 09:56:39.914609 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="99a97026-1a84-4969-8518-3e7ac150c55b" containerName="oc" Feb 28 09:56:39 crc kubenswrapper[4687]: I0228 09:56:39.915945 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6jn5b" Feb 28 09:56:39 crc kubenswrapper[4687]: I0228 09:56:39.921171 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6jn5b"] Feb 28 09:56:39 crc kubenswrapper[4687]: I0228 09:56:39.969754 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b1b3d63-47b9-4294-a98e-a9f2ef2896d9-utilities\") pod \"redhat-marketplace-6jn5b\" (UID: \"7b1b3d63-47b9-4294-a98e-a9f2ef2896d9\") " pod="openshift-marketplace/redhat-marketplace-6jn5b" Feb 28 09:56:39 crc kubenswrapper[4687]: I0228 09:56:39.969998 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b1b3d63-47b9-4294-a98e-a9f2ef2896d9-catalog-content\") pod \"redhat-marketplace-6jn5b\" (UID: \"7b1b3d63-47b9-4294-a98e-a9f2ef2896d9\") " pod="openshift-marketplace/redhat-marketplace-6jn5b" Feb 28 09:56:39 crc kubenswrapper[4687]: I0228 09:56:39.970147 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccsb8\" (UniqueName: \"kubernetes.io/projected/7b1b3d63-47b9-4294-a98e-a9f2ef2896d9-kube-api-access-ccsb8\") pod \"redhat-marketplace-6jn5b\" (UID: \"7b1b3d63-47b9-4294-a98e-a9f2ef2896d9\") " pod="openshift-marketplace/redhat-marketplace-6jn5b" Feb 28 09:56:40 crc kubenswrapper[4687]: I0228 09:56:40.071513 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b1b3d63-47b9-4294-a98e-a9f2ef2896d9-utilities\") pod \"redhat-marketplace-6jn5b\" (UID: \"7b1b3d63-47b9-4294-a98e-a9f2ef2896d9\") " pod="openshift-marketplace/redhat-marketplace-6jn5b" Feb 28 09:56:40 crc kubenswrapper[4687]: I0228 09:56:40.071737 4687 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b1b3d63-47b9-4294-a98e-a9f2ef2896d9-catalog-content\") pod \"redhat-marketplace-6jn5b\" (UID: \"7b1b3d63-47b9-4294-a98e-a9f2ef2896d9\") " pod="openshift-marketplace/redhat-marketplace-6jn5b" Feb 28 09:56:40 crc kubenswrapper[4687]: I0228 09:56:40.071865 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccsb8\" (UniqueName: \"kubernetes.io/projected/7b1b3d63-47b9-4294-a98e-a9f2ef2896d9-kube-api-access-ccsb8\") pod \"redhat-marketplace-6jn5b\" (UID: \"7b1b3d63-47b9-4294-a98e-a9f2ef2896d9\") " pod="openshift-marketplace/redhat-marketplace-6jn5b" Feb 28 09:56:40 crc kubenswrapper[4687]: I0228 09:56:40.072205 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b1b3d63-47b9-4294-a98e-a9f2ef2896d9-utilities\") pod \"redhat-marketplace-6jn5b\" (UID: \"7b1b3d63-47b9-4294-a98e-a9f2ef2896d9\") " pod="openshift-marketplace/redhat-marketplace-6jn5b" Feb 28 09:56:40 crc kubenswrapper[4687]: I0228 09:56:40.072758 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b1b3d63-47b9-4294-a98e-a9f2ef2896d9-catalog-content\") pod \"redhat-marketplace-6jn5b\" (UID: \"7b1b3d63-47b9-4294-a98e-a9f2ef2896d9\") " pod="openshift-marketplace/redhat-marketplace-6jn5b" Feb 28 09:56:40 crc kubenswrapper[4687]: I0228 09:56:40.095689 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccsb8\" (UniqueName: \"kubernetes.io/projected/7b1b3d63-47b9-4294-a98e-a9f2ef2896d9-kube-api-access-ccsb8\") pod \"redhat-marketplace-6jn5b\" (UID: \"7b1b3d63-47b9-4294-a98e-a9f2ef2896d9\") " pod="openshift-marketplace/redhat-marketplace-6jn5b" Feb 28 09:56:40 crc kubenswrapper[4687]: I0228 09:56:40.244965 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6jn5b" Feb 28 09:56:40 crc kubenswrapper[4687]: I0228 09:56:40.669912 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6jn5b"] Feb 28 09:56:40 crc kubenswrapper[4687]: I0228 09:56:40.886622 4687 generic.go:334] "Generic (PLEG): container finished" podID="7b1b3d63-47b9-4294-a98e-a9f2ef2896d9" containerID="ad51d9f55e43841b6ff1b5dba4cb3c4dfb3cea06dfeafe9252d706f4c9139524" exitCode=0 Feb 28 09:56:40 crc kubenswrapper[4687]: I0228 09:56:40.886678 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jn5b" event={"ID":"7b1b3d63-47b9-4294-a98e-a9f2ef2896d9","Type":"ContainerDied","Data":"ad51d9f55e43841b6ff1b5dba4cb3c4dfb3cea06dfeafe9252d706f4c9139524"} Feb 28 09:56:40 crc kubenswrapper[4687]: I0228 09:56:40.886715 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jn5b" event={"ID":"7b1b3d63-47b9-4294-a98e-a9f2ef2896d9","Type":"ContainerStarted","Data":"8a4371c2be8d1a9a1797ad28aaf3439db21b793f9767f2728eb40b89df75f7d3"} Feb 28 09:56:41 crc kubenswrapper[4687]: I0228 09:56:41.656874 4687 scope.go:117] "RemoveContainer" containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" Feb 28 09:56:41 crc kubenswrapper[4687]: E0228 09:56:41.657789 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:56:41 crc kubenswrapper[4687]: I0228 09:56:41.899841 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jn5b" 
event={"ID":"7b1b3d63-47b9-4294-a98e-a9f2ef2896d9","Type":"ContainerStarted","Data":"cb416b250969feee8710fe8c44c67854f1c7ea7c70f0dea64d276c999129c890"} Feb 28 09:56:42 crc kubenswrapper[4687]: I0228 09:56:42.912703 4687 generic.go:334] "Generic (PLEG): container finished" podID="7b1b3d63-47b9-4294-a98e-a9f2ef2896d9" containerID="cb416b250969feee8710fe8c44c67854f1c7ea7c70f0dea64d276c999129c890" exitCode=0 Feb 28 09:56:42 crc kubenswrapper[4687]: I0228 09:56:42.912798 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jn5b" event={"ID":"7b1b3d63-47b9-4294-a98e-a9f2ef2896d9","Type":"ContainerDied","Data":"cb416b250969feee8710fe8c44c67854f1c7ea7c70f0dea64d276c999129c890"} Feb 28 09:56:43 crc kubenswrapper[4687]: I0228 09:56:43.928992 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jn5b" event={"ID":"7b1b3d63-47b9-4294-a98e-a9f2ef2896d9","Type":"ContainerStarted","Data":"cfcd0b902062cb65e768981167021b3da30b5a48e4ceb0b4ec36d6ed5be1c637"} Feb 28 09:56:43 crc kubenswrapper[4687]: I0228 09:56:43.955967 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6jn5b" podStartSLOduration=2.438458101 podStartE2EDuration="4.955947615s" podCreationTimestamp="2026-02-28 09:56:39 +0000 UTC" firstStartedPulling="2026-02-28 09:56:40.888991146 +0000 UTC m=+3192.579560484" lastFinishedPulling="2026-02-28 09:56:43.40648066 +0000 UTC m=+3195.097049998" observedRunningTime="2026-02-28 09:56:43.94865728 +0000 UTC m=+3195.639226617" watchObservedRunningTime="2026-02-28 09:56:43.955947615 +0000 UTC m=+3195.646516952" Feb 28 09:56:44 crc kubenswrapper[4687]: I0228 09:56:44.196788 4687 scope.go:117] "RemoveContainer" containerID="df4fa89dc0c4bbc7e6438e7b211fa5c065a99ef15ffc17eadc6b168db4fd2e4c" Feb 28 09:56:50 crc kubenswrapper[4687]: I0228 09:56:50.245661 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6jn5b" Feb 28 09:56:50 crc kubenswrapper[4687]: I0228 09:56:50.246339 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6jn5b" Feb 28 09:56:50 crc kubenswrapper[4687]: I0228 09:56:50.287914 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6jn5b" Feb 28 09:56:51 crc kubenswrapper[4687]: I0228 09:56:51.036734 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6jn5b" Feb 28 09:56:51 crc kubenswrapper[4687]: I0228 09:56:51.088429 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6jn5b"] Feb 28 09:56:52 crc kubenswrapper[4687]: I0228 09:56:52.658085 4687 scope.go:117] "RemoveContainer" containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" Feb 28 09:56:52 crc kubenswrapper[4687]: E0228 09:56:52.659199 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 09:56:53 crc kubenswrapper[4687]: I0228 09:56:53.019796 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6jn5b" podUID="7b1b3d63-47b9-4294-a98e-a9f2ef2896d9" containerName="registry-server" containerID="cri-o://cfcd0b902062cb65e768981167021b3da30b5a48e4ceb0b4ec36d6ed5be1c637" gracePeriod=2 Feb 28 09:56:53 crc kubenswrapper[4687]: I0228 09:56:53.437770 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6jn5b" Feb 28 09:56:53 crc kubenswrapper[4687]: I0228 09:56:53.503886 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b1b3d63-47b9-4294-a98e-a9f2ef2896d9-catalog-content\") pod \"7b1b3d63-47b9-4294-a98e-a9f2ef2896d9\" (UID: \"7b1b3d63-47b9-4294-a98e-a9f2ef2896d9\") " Feb 28 09:56:53 crc kubenswrapper[4687]: I0228 09:56:53.503984 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccsb8\" (UniqueName: \"kubernetes.io/projected/7b1b3d63-47b9-4294-a98e-a9f2ef2896d9-kube-api-access-ccsb8\") pod \"7b1b3d63-47b9-4294-a98e-a9f2ef2896d9\" (UID: \"7b1b3d63-47b9-4294-a98e-a9f2ef2896d9\") " Feb 28 09:56:53 crc kubenswrapper[4687]: I0228 09:56:53.504013 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b1b3d63-47b9-4294-a98e-a9f2ef2896d9-utilities\") pod \"7b1b3d63-47b9-4294-a98e-a9f2ef2896d9\" (UID: \"7b1b3d63-47b9-4294-a98e-a9f2ef2896d9\") " Feb 28 09:56:53 crc kubenswrapper[4687]: I0228 09:56:53.504917 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b1b3d63-47b9-4294-a98e-a9f2ef2896d9-utilities" (OuterVolumeSpecName: "utilities") pod "7b1b3d63-47b9-4294-a98e-a9f2ef2896d9" (UID: "7b1b3d63-47b9-4294-a98e-a9f2ef2896d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:56:53 crc kubenswrapper[4687]: I0228 09:56:53.509866 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b1b3d63-47b9-4294-a98e-a9f2ef2896d9-kube-api-access-ccsb8" (OuterVolumeSpecName: "kube-api-access-ccsb8") pod "7b1b3d63-47b9-4294-a98e-a9f2ef2896d9" (UID: "7b1b3d63-47b9-4294-a98e-a9f2ef2896d9"). InnerVolumeSpecName "kube-api-access-ccsb8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:56:53 crc kubenswrapper[4687]: I0228 09:56:53.526426 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b1b3d63-47b9-4294-a98e-a9f2ef2896d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b1b3d63-47b9-4294-a98e-a9f2ef2896d9" (UID: "7b1b3d63-47b9-4294-a98e-a9f2ef2896d9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:56:53 crc kubenswrapper[4687]: I0228 09:56:53.605175 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b1b3d63-47b9-4294-a98e-a9f2ef2896d9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:56:53 crc kubenswrapper[4687]: I0228 09:56:53.605213 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccsb8\" (UniqueName: \"kubernetes.io/projected/7b1b3d63-47b9-4294-a98e-a9f2ef2896d9-kube-api-access-ccsb8\") on node \"crc\" DevicePath \"\"" Feb 28 09:56:53 crc kubenswrapper[4687]: I0228 09:56:53.605227 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b1b3d63-47b9-4294-a98e-a9f2ef2896d9-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:56:54 crc kubenswrapper[4687]: I0228 09:56:54.040169 4687 generic.go:334] "Generic (PLEG): container finished" podID="7b1b3d63-47b9-4294-a98e-a9f2ef2896d9" containerID="cfcd0b902062cb65e768981167021b3da30b5a48e4ceb0b4ec36d6ed5be1c637" exitCode=0 Feb 28 09:56:54 crc kubenswrapper[4687]: I0228 09:56:54.040243 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jn5b" event={"ID":"7b1b3d63-47b9-4294-a98e-a9f2ef2896d9","Type":"ContainerDied","Data":"cfcd0b902062cb65e768981167021b3da30b5a48e4ceb0b4ec36d6ed5be1c637"} Feb 28 09:56:54 crc kubenswrapper[4687]: I0228 09:56:54.040497 4687 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-6jn5b" event={"ID":"7b1b3d63-47b9-4294-a98e-a9f2ef2896d9","Type":"ContainerDied","Data":"8a4371c2be8d1a9a1797ad28aaf3439db21b793f9767f2728eb40b89df75f7d3"} Feb 28 09:56:54 crc kubenswrapper[4687]: I0228 09:56:54.040524 4687 scope.go:117] "RemoveContainer" containerID="cfcd0b902062cb65e768981167021b3da30b5a48e4ceb0b4ec36d6ed5be1c637" Feb 28 09:56:54 crc kubenswrapper[4687]: I0228 09:56:54.040260 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6jn5b" Feb 28 09:56:54 crc kubenswrapper[4687]: I0228 09:56:54.062012 4687 scope.go:117] "RemoveContainer" containerID="cb416b250969feee8710fe8c44c67854f1c7ea7c70f0dea64d276c999129c890" Feb 28 09:56:54 crc kubenswrapper[4687]: I0228 09:56:54.078188 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6jn5b"] Feb 28 09:56:54 crc kubenswrapper[4687]: I0228 09:56:54.086533 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6jn5b"] Feb 28 09:56:54 crc kubenswrapper[4687]: I0228 09:56:54.096503 4687 scope.go:117] "RemoveContainer" containerID="ad51d9f55e43841b6ff1b5dba4cb3c4dfb3cea06dfeafe9252d706f4c9139524" Feb 28 09:56:54 crc kubenswrapper[4687]: I0228 09:56:54.117107 4687 scope.go:117] "RemoveContainer" containerID="cfcd0b902062cb65e768981167021b3da30b5a48e4ceb0b4ec36d6ed5be1c637" Feb 28 09:56:54 crc kubenswrapper[4687]: E0228 09:56:54.117507 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfcd0b902062cb65e768981167021b3da30b5a48e4ceb0b4ec36d6ed5be1c637\": container with ID starting with cfcd0b902062cb65e768981167021b3da30b5a48e4ceb0b4ec36d6ed5be1c637 not found: ID does not exist" containerID="cfcd0b902062cb65e768981167021b3da30b5a48e4ceb0b4ec36d6ed5be1c637" Feb 28 09:56:54 crc kubenswrapper[4687]: I0228 09:56:54.117547 4687 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfcd0b902062cb65e768981167021b3da30b5a48e4ceb0b4ec36d6ed5be1c637"} err="failed to get container status \"cfcd0b902062cb65e768981167021b3da30b5a48e4ceb0b4ec36d6ed5be1c637\": rpc error: code = NotFound desc = could not find container \"cfcd0b902062cb65e768981167021b3da30b5a48e4ceb0b4ec36d6ed5be1c637\": container with ID starting with cfcd0b902062cb65e768981167021b3da30b5a48e4ceb0b4ec36d6ed5be1c637 not found: ID does not exist" Feb 28 09:56:54 crc kubenswrapper[4687]: I0228 09:56:54.117571 4687 scope.go:117] "RemoveContainer" containerID="cb416b250969feee8710fe8c44c67854f1c7ea7c70f0dea64d276c999129c890" Feb 28 09:56:54 crc kubenswrapper[4687]: E0228 09:56:54.117860 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb416b250969feee8710fe8c44c67854f1c7ea7c70f0dea64d276c999129c890\": container with ID starting with cb416b250969feee8710fe8c44c67854f1c7ea7c70f0dea64d276c999129c890 not found: ID does not exist" containerID="cb416b250969feee8710fe8c44c67854f1c7ea7c70f0dea64d276c999129c890" Feb 28 09:56:54 crc kubenswrapper[4687]: I0228 09:56:54.117896 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb416b250969feee8710fe8c44c67854f1c7ea7c70f0dea64d276c999129c890"} err="failed to get container status \"cb416b250969feee8710fe8c44c67854f1c7ea7c70f0dea64d276c999129c890\": rpc error: code = NotFound desc = could not find container \"cb416b250969feee8710fe8c44c67854f1c7ea7c70f0dea64d276c999129c890\": container with ID starting with cb416b250969feee8710fe8c44c67854f1c7ea7c70f0dea64d276c999129c890 not found: ID does not exist" Feb 28 09:56:54 crc kubenswrapper[4687]: I0228 09:56:54.117920 4687 scope.go:117] "RemoveContainer" containerID="ad51d9f55e43841b6ff1b5dba4cb3c4dfb3cea06dfeafe9252d706f4c9139524" Feb 28 09:56:54 crc kubenswrapper[4687]: E0228 
09:56:54.118202 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad51d9f55e43841b6ff1b5dba4cb3c4dfb3cea06dfeafe9252d706f4c9139524\": container with ID starting with ad51d9f55e43841b6ff1b5dba4cb3c4dfb3cea06dfeafe9252d706f4c9139524 not found: ID does not exist" containerID="ad51d9f55e43841b6ff1b5dba4cb3c4dfb3cea06dfeafe9252d706f4c9139524" Feb 28 09:56:54 crc kubenswrapper[4687]: I0228 09:56:54.118232 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad51d9f55e43841b6ff1b5dba4cb3c4dfb3cea06dfeafe9252d706f4c9139524"} err="failed to get container status \"ad51d9f55e43841b6ff1b5dba4cb3c4dfb3cea06dfeafe9252d706f4c9139524\": rpc error: code = NotFound desc = could not find container \"ad51d9f55e43841b6ff1b5dba4cb3c4dfb3cea06dfeafe9252d706f4c9139524\": container with ID starting with ad51d9f55e43841b6ff1b5dba4cb3c4dfb3cea06dfeafe9252d706f4c9139524 not found: ID does not exist" Feb 28 09:56:54 crc kubenswrapper[4687]: I0228 09:56:54.667818 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b1b3d63-47b9-4294-a98e-a9f2ef2896d9" path="/var/lib/kubelet/pods/7b1b3d63-47b9-4294-a98e-a9f2ef2896d9/volumes" Feb 28 09:57:06 crc kubenswrapper[4687]: I0228 09:57:06.657160 4687 scope.go:117] "RemoveContainer" containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" Feb 28 09:57:07 crc kubenswrapper[4687]: I0228 09:57:07.172397 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerStarted","Data":"7aa4c93cc379009cd173d6be3669f0744c441bb2f0f3fe73758c25336f7de5a1"} Feb 28 09:58:00 crc kubenswrapper[4687]: I0228 09:58:00.146289 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537878-rxfwt"] Feb 28 09:58:00 crc kubenswrapper[4687]: E0228 
09:58:00.148278 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b1b3d63-47b9-4294-a98e-a9f2ef2896d9" containerName="extract-content" Feb 28 09:58:00 crc kubenswrapper[4687]: I0228 09:58:00.148357 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b1b3d63-47b9-4294-a98e-a9f2ef2896d9" containerName="extract-content" Feb 28 09:58:00 crc kubenswrapper[4687]: E0228 09:58:00.148417 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b1b3d63-47b9-4294-a98e-a9f2ef2896d9" containerName="extract-utilities" Feb 28 09:58:00 crc kubenswrapper[4687]: I0228 09:58:00.148482 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b1b3d63-47b9-4294-a98e-a9f2ef2896d9" containerName="extract-utilities" Feb 28 09:58:00 crc kubenswrapper[4687]: E0228 09:58:00.148553 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b1b3d63-47b9-4294-a98e-a9f2ef2896d9" containerName="registry-server" Feb 28 09:58:00 crc kubenswrapper[4687]: I0228 09:58:00.148606 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b1b3d63-47b9-4294-a98e-a9f2ef2896d9" containerName="registry-server" Feb 28 09:58:00 crc kubenswrapper[4687]: I0228 09:58:00.148848 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b1b3d63-47b9-4294-a98e-a9f2ef2896d9" containerName="registry-server" Feb 28 09:58:00 crc kubenswrapper[4687]: I0228 09:58:00.149626 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537878-rxfwt" Feb 28 09:58:00 crc kubenswrapper[4687]: I0228 09:58:00.154223 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537878-rxfwt"] Feb 28 09:58:00 crc kubenswrapper[4687]: I0228 09:58:00.154830 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 09:58:00 crc kubenswrapper[4687]: I0228 09:58:00.155097 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 09:58:00 crc kubenswrapper[4687]: I0228 09:58:00.155227 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 09:58:00 crc kubenswrapper[4687]: I0228 09:58:00.203752 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlbw7\" (UniqueName: \"kubernetes.io/projected/812e6bb9-47ab-4e7a-9376-6337b4968de0-kube-api-access-hlbw7\") pod \"auto-csr-approver-29537878-rxfwt\" (UID: \"812e6bb9-47ab-4e7a-9376-6337b4968de0\") " pod="openshift-infra/auto-csr-approver-29537878-rxfwt" Feb 28 09:58:00 crc kubenswrapper[4687]: I0228 09:58:00.306640 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlbw7\" (UniqueName: \"kubernetes.io/projected/812e6bb9-47ab-4e7a-9376-6337b4968de0-kube-api-access-hlbw7\") pod \"auto-csr-approver-29537878-rxfwt\" (UID: \"812e6bb9-47ab-4e7a-9376-6337b4968de0\") " pod="openshift-infra/auto-csr-approver-29537878-rxfwt" Feb 28 09:58:00 crc kubenswrapper[4687]: I0228 09:58:00.326098 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlbw7\" (UniqueName: \"kubernetes.io/projected/812e6bb9-47ab-4e7a-9376-6337b4968de0-kube-api-access-hlbw7\") pod \"auto-csr-approver-29537878-rxfwt\" (UID: \"812e6bb9-47ab-4e7a-9376-6337b4968de0\") " 
pod="openshift-infra/auto-csr-approver-29537878-rxfwt" Feb 28 09:58:00 crc kubenswrapper[4687]: I0228 09:58:00.476595 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537878-rxfwt" Feb 28 09:58:00 crc kubenswrapper[4687]: I0228 09:58:00.883775 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537878-rxfwt"] Feb 28 09:58:01 crc kubenswrapper[4687]: I0228 09:58:01.710442 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537878-rxfwt" event={"ID":"812e6bb9-47ab-4e7a-9376-6337b4968de0","Type":"ContainerStarted","Data":"4d04ca5862bfc4ccaf7e20ee3cf0dd49eb3340fb3830d0f1dba65ddd61ba85f3"} Feb 28 09:58:02 crc kubenswrapper[4687]: I0228 09:58:02.728831 4687 generic.go:334] "Generic (PLEG): container finished" podID="812e6bb9-47ab-4e7a-9376-6337b4968de0" containerID="7402f77759435d60461a001d49fe8652ae8b0b7ac26f0a4ac79d68967420a4d5" exitCode=0 Feb 28 09:58:02 crc kubenswrapper[4687]: I0228 09:58:02.728929 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537878-rxfwt" event={"ID":"812e6bb9-47ab-4e7a-9376-6337b4968de0","Type":"ContainerDied","Data":"7402f77759435d60461a001d49fe8652ae8b0b7ac26f0a4ac79d68967420a4d5"} Feb 28 09:58:04 crc kubenswrapper[4687]: I0228 09:58:04.050245 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537878-rxfwt" Feb 28 09:58:04 crc kubenswrapper[4687]: I0228 09:58:04.190365 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlbw7\" (UniqueName: \"kubernetes.io/projected/812e6bb9-47ab-4e7a-9376-6337b4968de0-kube-api-access-hlbw7\") pod \"812e6bb9-47ab-4e7a-9376-6337b4968de0\" (UID: \"812e6bb9-47ab-4e7a-9376-6337b4968de0\") " Feb 28 09:58:04 crc kubenswrapper[4687]: I0228 09:58:04.203335 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/812e6bb9-47ab-4e7a-9376-6337b4968de0-kube-api-access-hlbw7" (OuterVolumeSpecName: "kube-api-access-hlbw7") pod "812e6bb9-47ab-4e7a-9376-6337b4968de0" (UID: "812e6bb9-47ab-4e7a-9376-6337b4968de0"). InnerVolumeSpecName "kube-api-access-hlbw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:58:04 crc kubenswrapper[4687]: I0228 09:58:04.294087 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlbw7\" (UniqueName: \"kubernetes.io/projected/812e6bb9-47ab-4e7a-9376-6337b4968de0-kube-api-access-hlbw7\") on node \"crc\" DevicePath \"\"" Feb 28 09:58:04 crc kubenswrapper[4687]: I0228 09:58:04.753980 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537878-rxfwt" event={"ID":"812e6bb9-47ab-4e7a-9376-6337b4968de0","Type":"ContainerDied","Data":"4d04ca5862bfc4ccaf7e20ee3cf0dd49eb3340fb3830d0f1dba65ddd61ba85f3"} Feb 28 09:58:04 crc kubenswrapper[4687]: I0228 09:58:04.754423 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d04ca5862bfc4ccaf7e20ee3cf0dd49eb3340fb3830d0f1dba65ddd61ba85f3" Feb 28 09:58:04 crc kubenswrapper[4687]: I0228 09:58:04.754301 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537878-rxfwt" Feb 28 09:58:05 crc kubenswrapper[4687]: I0228 09:58:05.119513 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537872-4vzzf"] Feb 28 09:58:05 crc kubenswrapper[4687]: I0228 09:58:05.128880 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537872-4vzzf"] Feb 28 09:58:06 crc kubenswrapper[4687]: I0228 09:58:06.697242 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12154ab8-bd23-418d-a6d3-a1b4c8d51fad" path="/var/lib/kubelet/pods/12154ab8-bd23-418d-a6d3-a1b4c8d51fad/volumes" Feb 28 09:58:11 crc kubenswrapper[4687]: I0228 09:58:11.489237 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4t7zj"] Feb 28 09:58:11 crc kubenswrapper[4687]: E0228 09:58:11.490354 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="812e6bb9-47ab-4e7a-9376-6337b4968de0" containerName="oc" Feb 28 09:58:11 crc kubenswrapper[4687]: I0228 09:58:11.490370 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="812e6bb9-47ab-4e7a-9376-6337b4968de0" containerName="oc" Feb 28 09:58:11 crc kubenswrapper[4687]: I0228 09:58:11.490660 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="812e6bb9-47ab-4e7a-9376-6337b4968de0" containerName="oc" Feb 28 09:58:11 crc kubenswrapper[4687]: I0228 09:58:11.492256 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4t7zj" Feb 28 09:58:11 crc kubenswrapper[4687]: I0228 09:58:11.497667 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4t7zj"] Feb 28 09:58:11 crc kubenswrapper[4687]: I0228 09:58:11.561663 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/384acf0c-1fab-456d-8f0a-7f5ddaffda57-catalog-content\") pod \"certified-operators-4t7zj\" (UID: \"384acf0c-1fab-456d-8f0a-7f5ddaffda57\") " pod="openshift-marketplace/certified-operators-4t7zj" Feb 28 09:58:11 crc kubenswrapper[4687]: I0228 09:58:11.561954 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqqpn\" (UniqueName: \"kubernetes.io/projected/384acf0c-1fab-456d-8f0a-7f5ddaffda57-kube-api-access-wqqpn\") pod \"certified-operators-4t7zj\" (UID: \"384acf0c-1fab-456d-8f0a-7f5ddaffda57\") " pod="openshift-marketplace/certified-operators-4t7zj" Feb 28 09:58:11 crc kubenswrapper[4687]: I0228 09:58:11.562141 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/384acf0c-1fab-456d-8f0a-7f5ddaffda57-utilities\") pod \"certified-operators-4t7zj\" (UID: \"384acf0c-1fab-456d-8f0a-7f5ddaffda57\") " pod="openshift-marketplace/certified-operators-4t7zj" Feb 28 09:58:11 crc kubenswrapper[4687]: I0228 09:58:11.664711 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqqpn\" (UniqueName: \"kubernetes.io/projected/384acf0c-1fab-456d-8f0a-7f5ddaffda57-kube-api-access-wqqpn\") pod \"certified-operators-4t7zj\" (UID: \"384acf0c-1fab-456d-8f0a-7f5ddaffda57\") " pod="openshift-marketplace/certified-operators-4t7zj" Feb 28 09:58:11 crc kubenswrapper[4687]: I0228 09:58:11.664793 4687 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/384acf0c-1fab-456d-8f0a-7f5ddaffda57-utilities\") pod \"certified-operators-4t7zj\" (UID: \"384acf0c-1fab-456d-8f0a-7f5ddaffda57\") " pod="openshift-marketplace/certified-operators-4t7zj" Feb 28 09:58:11 crc kubenswrapper[4687]: I0228 09:58:11.664853 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/384acf0c-1fab-456d-8f0a-7f5ddaffda57-catalog-content\") pod \"certified-operators-4t7zj\" (UID: \"384acf0c-1fab-456d-8f0a-7f5ddaffda57\") " pod="openshift-marketplace/certified-operators-4t7zj" Feb 28 09:58:11 crc kubenswrapper[4687]: I0228 09:58:11.665366 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/384acf0c-1fab-456d-8f0a-7f5ddaffda57-utilities\") pod \"certified-operators-4t7zj\" (UID: \"384acf0c-1fab-456d-8f0a-7f5ddaffda57\") " pod="openshift-marketplace/certified-operators-4t7zj" Feb 28 09:58:11 crc kubenswrapper[4687]: I0228 09:58:11.665407 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/384acf0c-1fab-456d-8f0a-7f5ddaffda57-catalog-content\") pod \"certified-operators-4t7zj\" (UID: \"384acf0c-1fab-456d-8f0a-7f5ddaffda57\") " pod="openshift-marketplace/certified-operators-4t7zj" Feb 28 09:58:11 crc kubenswrapper[4687]: I0228 09:58:11.681950 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqqpn\" (UniqueName: \"kubernetes.io/projected/384acf0c-1fab-456d-8f0a-7f5ddaffda57-kube-api-access-wqqpn\") pod \"certified-operators-4t7zj\" (UID: \"384acf0c-1fab-456d-8f0a-7f5ddaffda57\") " pod="openshift-marketplace/certified-operators-4t7zj" Feb 28 09:58:11 crc kubenswrapper[4687]: I0228 09:58:11.820087 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4t7zj" Feb 28 09:58:12 crc kubenswrapper[4687]: I0228 09:58:12.278286 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4t7zj"] Feb 28 09:58:12 crc kubenswrapper[4687]: I0228 09:58:12.824110 4687 generic.go:334] "Generic (PLEG): container finished" podID="384acf0c-1fab-456d-8f0a-7f5ddaffda57" containerID="f6cbeb02dc69754c4e8ae02771a4be569a1c735e43f4903a02f49706795a15c4" exitCode=0 Feb 28 09:58:12 crc kubenswrapper[4687]: I0228 09:58:12.824404 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4t7zj" event={"ID":"384acf0c-1fab-456d-8f0a-7f5ddaffda57","Type":"ContainerDied","Data":"f6cbeb02dc69754c4e8ae02771a4be569a1c735e43f4903a02f49706795a15c4"} Feb 28 09:58:12 crc kubenswrapper[4687]: I0228 09:58:12.824437 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4t7zj" event={"ID":"384acf0c-1fab-456d-8f0a-7f5ddaffda57","Type":"ContainerStarted","Data":"9e62830c0dd1cfa63152eb13bbbe7738f18d57aa0da96375b01faea298cc78aa"} Feb 28 09:58:13 crc kubenswrapper[4687]: I0228 09:58:13.834864 4687 generic.go:334] "Generic (PLEG): container finished" podID="384acf0c-1fab-456d-8f0a-7f5ddaffda57" containerID="f5fba2d1d40ab58cfd6dc3e06b6f434493035c48c235e6b73c1f68c274da99fd" exitCode=0 Feb 28 09:58:13 crc kubenswrapper[4687]: I0228 09:58:13.834943 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4t7zj" event={"ID":"384acf0c-1fab-456d-8f0a-7f5ddaffda57","Type":"ContainerDied","Data":"f5fba2d1d40ab58cfd6dc3e06b6f434493035c48c235e6b73c1f68c274da99fd"} Feb 28 09:58:14 crc kubenswrapper[4687]: I0228 09:58:14.672411 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nzdtq"] Feb 28 09:58:14 crc kubenswrapper[4687]: I0228 09:58:14.675216 4687 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nzdtq" Feb 28 09:58:14 crc kubenswrapper[4687]: I0228 09:58:14.686236 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nzdtq"] Feb 28 09:58:14 crc kubenswrapper[4687]: I0228 09:58:14.731871 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8kzw\" (UniqueName: \"kubernetes.io/projected/ad749506-e029-4d9e-96a9-69063a9c9dd0-kube-api-access-w8kzw\") pod \"redhat-operators-nzdtq\" (UID: \"ad749506-e029-4d9e-96a9-69063a9c9dd0\") " pod="openshift-marketplace/redhat-operators-nzdtq" Feb 28 09:58:14 crc kubenswrapper[4687]: I0228 09:58:14.731970 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad749506-e029-4d9e-96a9-69063a9c9dd0-catalog-content\") pod \"redhat-operators-nzdtq\" (UID: \"ad749506-e029-4d9e-96a9-69063a9c9dd0\") " pod="openshift-marketplace/redhat-operators-nzdtq" Feb 28 09:58:14 crc kubenswrapper[4687]: I0228 09:58:14.732144 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad749506-e029-4d9e-96a9-69063a9c9dd0-utilities\") pod \"redhat-operators-nzdtq\" (UID: \"ad749506-e029-4d9e-96a9-69063a9c9dd0\") " pod="openshift-marketplace/redhat-operators-nzdtq" Feb 28 09:58:14 crc kubenswrapper[4687]: I0228 09:58:14.834917 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8kzw\" (UniqueName: \"kubernetes.io/projected/ad749506-e029-4d9e-96a9-69063a9c9dd0-kube-api-access-w8kzw\") pod \"redhat-operators-nzdtq\" (UID: \"ad749506-e029-4d9e-96a9-69063a9c9dd0\") " pod="openshift-marketplace/redhat-operators-nzdtq" Feb 28 09:58:14 crc kubenswrapper[4687]: I0228 09:58:14.834990 4687 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad749506-e029-4d9e-96a9-69063a9c9dd0-catalog-content\") pod \"redhat-operators-nzdtq\" (UID: \"ad749506-e029-4d9e-96a9-69063a9c9dd0\") " pod="openshift-marketplace/redhat-operators-nzdtq" Feb 28 09:58:14 crc kubenswrapper[4687]: I0228 09:58:14.835034 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad749506-e029-4d9e-96a9-69063a9c9dd0-utilities\") pod \"redhat-operators-nzdtq\" (UID: \"ad749506-e029-4d9e-96a9-69063a9c9dd0\") " pod="openshift-marketplace/redhat-operators-nzdtq" Feb 28 09:58:14 crc kubenswrapper[4687]: I0228 09:58:14.835560 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad749506-e029-4d9e-96a9-69063a9c9dd0-catalog-content\") pod \"redhat-operators-nzdtq\" (UID: \"ad749506-e029-4d9e-96a9-69063a9c9dd0\") " pod="openshift-marketplace/redhat-operators-nzdtq" Feb 28 09:58:14 crc kubenswrapper[4687]: I0228 09:58:14.835586 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad749506-e029-4d9e-96a9-69063a9c9dd0-utilities\") pod \"redhat-operators-nzdtq\" (UID: \"ad749506-e029-4d9e-96a9-69063a9c9dd0\") " pod="openshift-marketplace/redhat-operators-nzdtq" Feb 28 09:58:14 crc kubenswrapper[4687]: I0228 09:58:14.844050 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4t7zj" event={"ID":"384acf0c-1fab-456d-8f0a-7f5ddaffda57","Type":"ContainerStarted","Data":"9f3ac7e4dc2657a1e981c20a11f432104e282b1df9c2609b08f358f89ae82bfb"} Feb 28 09:58:14 crc kubenswrapper[4687]: I0228 09:58:14.858512 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8kzw\" (UniqueName: 
\"kubernetes.io/projected/ad749506-e029-4d9e-96a9-69063a9c9dd0-kube-api-access-w8kzw\") pod \"redhat-operators-nzdtq\" (UID: \"ad749506-e029-4d9e-96a9-69063a9c9dd0\") " pod="openshift-marketplace/redhat-operators-nzdtq" Feb 28 09:58:14 crc kubenswrapper[4687]: I0228 09:58:14.863089 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4t7zj" podStartSLOduration=2.396138478 podStartE2EDuration="3.863077812s" podCreationTimestamp="2026-02-28 09:58:11 +0000 UTC" firstStartedPulling="2026-02-28 09:58:12.826855991 +0000 UTC m=+3284.517425328" lastFinishedPulling="2026-02-28 09:58:14.293795325 +0000 UTC m=+3285.984364662" observedRunningTime="2026-02-28 09:58:14.859834444 +0000 UTC m=+3286.550403781" watchObservedRunningTime="2026-02-28 09:58:14.863077812 +0000 UTC m=+3286.553647138" Feb 28 09:58:14 crc kubenswrapper[4687]: I0228 09:58:14.990845 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nzdtq" Feb 28 09:58:15 crc kubenswrapper[4687]: I0228 09:58:15.420688 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nzdtq"] Feb 28 09:58:15 crc kubenswrapper[4687]: W0228 09:58:15.422513 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad749506_e029_4d9e_96a9_69063a9c9dd0.slice/crio-eb86ac7848c5670038b82a16e0047c2efc9fb2e0f6d58ac63b79979afbed3746 WatchSource:0}: Error finding container eb86ac7848c5670038b82a16e0047c2efc9fb2e0f6d58ac63b79979afbed3746: Status 404 returned error can't find the container with id eb86ac7848c5670038b82a16e0047c2efc9fb2e0f6d58ac63b79979afbed3746 Feb 28 09:58:15 crc kubenswrapper[4687]: I0228 09:58:15.854136 4687 generic.go:334] "Generic (PLEG): container finished" podID="ad749506-e029-4d9e-96a9-69063a9c9dd0" containerID="c9d215aac639a6a1633054af7bc8365704c83858f26cbe5841a69bed670953ba" 
exitCode=0 Feb 28 09:58:15 crc kubenswrapper[4687]: I0228 09:58:15.854358 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzdtq" event={"ID":"ad749506-e029-4d9e-96a9-69063a9c9dd0","Type":"ContainerDied","Data":"c9d215aac639a6a1633054af7bc8365704c83858f26cbe5841a69bed670953ba"} Feb 28 09:58:15 crc kubenswrapper[4687]: I0228 09:58:15.854435 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzdtq" event={"ID":"ad749506-e029-4d9e-96a9-69063a9c9dd0","Type":"ContainerStarted","Data":"eb86ac7848c5670038b82a16e0047c2efc9fb2e0f6d58ac63b79979afbed3746"} Feb 28 09:58:16 crc kubenswrapper[4687]: I0228 09:58:16.864386 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzdtq" event={"ID":"ad749506-e029-4d9e-96a9-69063a9c9dd0","Type":"ContainerStarted","Data":"d169f55a267dc943feb27306810d06355b52ad314cd035dd45e016cf4a5b8c0f"} Feb 28 09:58:17 crc kubenswrapper[4687]: I0228 09:58:17.877998 4687 generic.go:334] "Generic (PLEG): container finished" podID="ad749506-e029-4d9e-96a9-69063a9c9dd0" containerID="d169f55a267dc943feb27306810d06355b52ad314cd035dd45e016cf4a5b8c0f" exitCode=0 Feb 28 09:58:17 crc kubenswrapper[4687]: I0228 09:58:17.878071 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzdtq" event={"ID":"ad749506-e029-4d9e-96a9-69063a9c9dd0","Type":"ContainerDied","Data":"d169f55a267dc943feb27306810d06355b52ad314cd035dd45e016cf4a5b8c0f"} Feb 28 09:58:18 crc kubenswrapper[4687]: I0228 09:58:18.768734 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-j45kw/must-gather-jqt65"] Feb 28 09:58:18 crc kubenswrapper[4687]: I0228 09:58:18.770632 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j45kw/must-gather-jqt65" Feb 28 09:58:18 crc kubenswrapper[4687]: I0228 09:58:18.776433 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-j45kw"/"openshift-service-ca.crt" Feb 28 09:58:18 crc kubenswrapper[4687]: I0228 09:58:18.776678 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-j45kw"/"kube-root-ca.crt" Feb 28 09:58:18 crc kubenswrapper[4687]: I0228 09:58:18.789755 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-j45kw/must-gather-jqt65"] Feb 28 09:58:18 crc kubenswrapper[4687]: I0228 09:58:18.811236 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1640ed83-395f-4d74-85f2-846f87f43da0-must-gather-output\") pod \"must-gather-jqt65\" (UID: \"1640ed83-395f-4d74-85f2-846f87f43da0\") " pod="openshift-must-gather-j45kw/must-gather-jqt65" Feb 28 09:58:18 crc kubenswrapper[4687]: I0228 09:58:18.811503 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvgcv\" (UniqueName: \"kubernetes.io/projected/1640ed83-395f-4d74-85f2-846f87f43da0-kube-api-access-gvgcv\") pod \"must-gather-jqt65\" (UID: \"1640ed83-395f-4d74-85f2-846f87f43da0\") " pod="openshift-must-gather-j45kw/must-gather-jqt65" Feb 28 09:58:18 crc kubenswrapper[4687]: I0228 09:58:18.887720 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzdtq" event={"ID":"ad749506-e029-4d9e-96a9-69063a9c9dd0","Type":"ContainerStarted","Data":"7ea880d4f23bbe4b62dfa906b3bd6815993890edf363ba5f8e8d411bb726c8b0"} Feb 28 09:58:18 crc kubenswrapper[4687]: I0228 09:58:18.911396 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nzdtq" podStartSLOduration=2.451173465 
podStartE2EDuration="4.911379019s" podCreationTimestamp="2026-02-28 09:58:14 +0000 UTC" firstStartedPulling="2026-02-28 09:58:15.85622656 +0000 UTC m=+3287.546795896" lastFinishedPulling="2026-02-28 09:58:18.316432113 +0000 UTC m=+3290.007001450" observedRunningTime="2026-02-28 09:58:18.903085948 +0000 UTC m=+3290.593655286" watchObservedRunningTime="2026-02-28 09:58:18.911379019 +0000 UTC m=+3290.601948356" Feb 28 09:58:18 crc kubenswrapper[4687]: I0228 09:58:18.912442 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1640ed83-395f-4d74-85f2-846f87f43da0-must-gather-output\") pod \"must-gather-jqt65\" (UID: \"1640ed83-395f-4d74-85f2-846f87f43da0\") " pod="openshift-must-gather-j45kw/must-gather-jqt65" Feb 28 09:58:18 crc kubenswrapper[4687]: I0228 09:58:18.912645 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvgcv\" (UniqueName: \"kubernetes.io/projected/1640ed83-395f-4d74-85f2-846f87f43da0-kube-api-access-gvgcv\") pod \"must-gather-jqt65\" (UID: \"1640ed83-395f-4d74-85f2-846f87f43da0\") " pod="openshift-must-gather-j45kw/must-gather-jqt65" Feb 28 09:58:18 crc kubenswrapper[4687]: I0228 09:58:18.912951 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1640ed83-395f-4d74-85f2-846f87f43da0-must-gather-output\") pod \"must-gather-jqt65\" (UID: \"1640ed83-395f-4d74-85f2-846f87f43da0\") " pod="openshift-must-gather-j45kw/must-gather-jqt65" Feb 28 09:58:18 crc kubenswrapper[4687]: I0228 09:58:18.941420 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvgcv\" (UniqueName: \"kubernetes.io/projected/1640ed83-395f-4d74-85f2-846f87f43da0-kube-api-access-gvgcv\") pod \"must-gather-jqt65\" (UID: \"1640ed83-395f-4d74-85f2-846f87f43da0\") " pod="openshift-must-gather-j45kw/must-gather-jqt65" Feb 28 09:58:19 crc 
kubenswrapper[4687]: I0228 09:58:19.086045 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j45kw/must-gather-jqt65" Feb 28 09:58:19 crc kubenswrapper[4687]: I0228 09:58:19.509827 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-j45kw/must-gather-jqt65"] Feb 28 09:58:19 crc kubenswrapper[4687]: I0228 09:58:19.898094 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j45kw/must-gather-jqt65" event={"ID":"1640ed83-395f-4d74-85f2-846f87f43da0","Type":"ContainerStarted","Data":"a73b3dc6f9bbd290dd9f20e246e5a7131d2bda1e2ca283aff89653f5a75b5af4"} Feb 28 09:58:19 crc kubenswrapper[4687]: I0228 09:58:19.898144 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j45kw/must-gather-jqt65" event={"ID":"1640ed83-395f-4d74-85f2-846f87f43da0","Type":"ContainerStarted","Data":"a0d0184ecd3a63c7d76f24ae9c8e75c30709693cf4f344783d8343a26d78bb0b"} Feb 28 09:58:20 crc kubenswrapper[4687]: I0228 09:58:20.909652 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j45kw/must-gather-jqt65" event={"ID":"1640ed83-395f-4d74-85f2-846f87f43da0","Type":"ContainerStarted","Data":"ad08b4ec977f616161821cf226b32c184869495faa546ae6d58ad8c2762ed00f"} Feb 28 09:58:20 crc kubenswrapper[4687]: I0228 09:58:20.932554 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-j45kw/must-gather-jqt65" podStartSLOduration=2.932518417 podStartE2EDuration="2.932518417s" podCreationTimestamp="2026-02-28 09:58:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:58:20.931947713 +0000 UTC m=+3292.622517060" watchObservedRunningTime="2026-02-28 09:58:20.932518417 +0000 UTC m=+3292.623087753" Feb 28 09:58:21 crc kubenswrapper[4687]: I0228 09:58:21.820817 4687 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-4t7zj" Feb 28 09:58:21 crc kubenswrapper[4687]: I0228 09:58:21.821289 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4t7zj" Feb 28 09:58:21 crc kubenswrapper[4687]: I0228 09:58:21.864392 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4t7zj" Feb 28 09:58:21 crc kubenswrapper[4687]: I0228 09:58:21.961464 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4t7zj" Feb 28 09:58:22 crc kubenswrapper[4687]: I0228 09:58:22.950973 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-j45kw/crc-debug-rqkqf"] Feb 28 09:58:22 crc kubenswrapper[4687]: I0228 09:58:22.952462 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j45kw/crc-debug-rqkqf" Feb 28 09:58:22 crc kubenswrapper[4687]: I0228 09:58:22.954577 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-j45kw"/"default-dockercfg-zh9xg" Feb 28 09:58:23 crc kubenswrapper[4687]: I0228 09:58:23.004196 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97pb6\" (UniqueName: \"kubernetes.io/projected/70dfcf06-9de6-4f67-87dd-e8196b03d762-kube-api-access-97pb6\") pod \"crc-debug-rqkqf\" (UID: \"70dfcf06-9de6-4f67-87dd-e8196b03d762\") " pod="openshift-must-gather-j45kw/crc-debug-rqkqf" Feb 28 09:58:23 crc kubenswrapper[4687]: I0228 09:58:23.004663 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70dfcf06-9de6-4f67-87dd-e8196b03d762-host\") pod \"crc-debug-rqkqf\" (UID: \"70dfcf06-9de6-4f67-87dd-e8196b03d762\") " pod="openshift-must-gather-j45kw/crc-debug-rqkqf" Feb 28 
09:58:23 crc kubenswrapper[4687]: I0228 09:58:23.066112 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4t7zj"] Feb 28 09:58:23 crc kubenswrapper[4687]: I0228 09:58:23.106659 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70dfcf06-9de6-4f67-87dd-e8196b03d762-host\") pod \"crc-debug-rqkqf\" (UID: \"70dfcf06-9de6-4f67-87dd-e8196b03d762\") " pod="openshift-must-gather-j45kw/crc-debug-rqkqf" Feb 28 09:58:23 crc kubenswrapper[4687]: I0228 09:58:23.106711 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97pb6\" (UniqueName: \"kubernetes.io/projected/70dfcf06-9de6-4f67-87dd-e8196b03d762-kube-api-access-97pb6\") pod \"crc-debug-rqkqf\" (UID: \"70dfcf06-9de6-4f67-87dd-e8196b03d762\") " pod="openshift-must-gather-j45kw/crc-debug-rqkqf" Feb 28 09:58:23 crc kubenswrapper[4687]: I0228 09:58:23.106787 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70dfcf06-9de6-4f67-87dd-e8196b03d762-host\") pod \"crc-debug-rqkqf\" (UID: \"70dfcf06-9de6-4f67-87dd-e8196b03d762\") " pod="openshift-must-gather-j45kw/crc-debug-rqkqf" Feb 28 09:58:23 crc kubenswrapper[4687]: I0228 09:58:23.133366 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97pb6\" (UniqueName: \"kubernetes.io/projected/70dfcf06-9de6-4f67-87dd-e8196b03d762-kube-api-access-97pb6\") pod \"crc-debug-rqkqf\" (UID: \"70dfcf06-9de6-4f67-87dd-e8196b03d762\") " pod="openshift-must-gather-j45kw/crc-debug-rqkqf" Feb 28 09:58:23 crc kubenswrapper[4687]: I0228 09:58:23.269634 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j45kw/crc-debug-rqkqf" Feb 28 09:58:23 crc kubenswrapper[4687]: W0228 09:58:23.320773 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70dfcf06_9de6_4f67_87dd_e8196b03d762.slice/crio-ac959f4af12e144e7c3f9c058715bf960f6966a843835d758d373433839b881d WatchSource:0}: Error finding container ac959f4af12e144e7c3f9c058715bf960f6966a843835d758d373433839b881d: Status 404 returned error can't find the container with id ac959f4af12e144e7c3f9c058715bf960f6966a843835d758d373433839b881d Feb 28 09:58:23 crc kubenswrapper[4687]: I0228 09:58:23.940650 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j45kw/crc-debug-rqkqf" event={"ID":"70dfcf06-9de6-4f67-87dd-e8196b03d762","Type":"ContainerStarted","Data":"6a188535abca0d0acd989b48e413b5a8abcba88fc2373fb5bc78084099db5c25"} Feb 28 09:58:23 crc kubenswrapper[4687]: I0228 09:58:23.941097 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j45kw/crc-debug-rqkqf" event={"ID":"70dfcf06-9de6-4f67-87dd-e8196b03d762","Type":"ContainerStarted","Data":"ac959f4af12e144e7c3f9c058715bf960f6966a843835d758d373433839b881d"} Feb 28 09:58:23 crc kubenswrapper[4687]: I0228 09:58:23.940816 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4t7zj" podUID="384acf0c-1fab-456d-8f0a-7f5ddaffda57" containerName="registry-server" containerID="cri-o://9f3ac7e4dc2657a1e981c20a11f432104e282b1df9c2609b08f358f89ae82bfb" gracePeriod=2 Feb 28 09:58:23 crc kubenswrapper[4687]: I0228 09:58:23.975444 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-j45kw/crc-debug-rqkqf" podStartSLOduration=1.975424753 podStartE2EDuration="1.975424753s" podCreationTimestamp="2026-02-28 09:58:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:58:23.9661833 +0000 UTC m=+3295.656752636" watchObservedRunningTime="2026-02-28 09:58:23.975424753 +0000 UTC m=+3295.665994090" Feb 28 09:58:24 crc kubenswrapper[4687]: I0228 09:58:24.891910 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4t7zj" Feb 28 09:58:24 crc kubenswrapper[4687]: I0228 09:58:24.955945 4687 generic.go:334] "Generic (PLEG): container finished" podID="384acf0c-1fab-456d-8f0a-7f5ddaffda57" containerID="9f3ac7e4dc2657a1e981c20a11f432104e282b1df9c2609b08f358f89ae82bfb" exitCode=0 Feb 28 09:58:24 crc kubenswrapper[4687]: I0228 09:58:24.956011 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4t7zj" Feb 28 09:58:24 crc kubenswrapper[4687]: I0228 09:58:24.956053 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4t7zj" event={"ID":"384acf0c-1fab-456d-8f0a-7f5ddaffda57","Type":"ContainerDied","Data":"9f3ac7e4dc2657a1e981c20a11f432104e282b1df9c2609b08f358f89ae82bfb"} Feb 28 09:58:24 crc kubenswrapper[4687]: I0228 09:58:24.956895 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4t7zj" event={"ID":"384acf0c-1fab-456d-8f0a-7f5ddaffda57","Type":"ContainerDied","Data":"9e62830c0dd1cfa63152eb13bbbe7738f18d57aa0da96375b01faea298cc78aa"} Feb 28 09:58:24 crc kubenswrapper[4687]: I0228 09:58:24.956928 4687 scope.go:117] "RemoveContainer" containerID="9f3ac7e4dc2657a1e981c20a11f432104e282b1df9c2609b08f358f89ae82bfb" Feb 28 09:58:24 crc kubenswrapper[4687]: I0228 09:58:24.991928 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nzdtq" Feb 28 09:58:24 crc kubenswrapper[4687]: I0228 09:58:24.993185 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-nzdtq" Feb 28 09:58:24 crc kubenswrapper[4687]: I0228 09:58:24.994728 4687 scope.go:117] "RemoveContainer" containerID="f5fba2d1d40ab58cfd6dc3e06b6f434493035c48c235e6b73c1f68c274da99fd" Feb 28 09:58:25 crc kubenswrapper[4687]: I0228 09:58:25.033609 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nzdtq" Feb 28 09:58:25 crc kubenswrapper[4687]: I0228 09:58:25.047002 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/384acf0c-1fab-456d-8f0a-7f5ddaffda57-utilities\") pod \"384acf0c-1fab-456d-8f0a-7f5ddaffda57\" (UID: \"384acf0c-1fab-456d-8f0a-7f5ddaffda57\") " Feb 28 09:58:25 crc kubenswrapper[4687]: I0228 09:58:25.047150 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/384acf0c-1fab-456d-8f0a-7f5ddaffda57-catalog-content\") pod \"384acf0c-1fab-456d-8f0a-7f5ddaffda57\" (UID: \"384acf0c-1fab-456d-8f0a-7f5ddaffda57\") " Feb 28 09:58:25 crc kubenswrapper[4687]: I0228 09:58:25.047223 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqqpn\" (UniqueName: \"kubernetes.io/projected/384acf0c-1fab-456d-8f0a-7f5ddaffda57-kube-api-access-wqqpn\") pod \"384acf0c-1fab-456d-8f0a-7f5ddaffda57\" (UID: \"384acf0c-1fab-456d-8f0a-7f5ddaffda57\") " Feb 28 09:58:25 crc kubenswrapper[4687]: I0228 09:58:25.047622 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/384acf0c-1fab-456d-8f0a-7f5ddaffda57-utilities" (OuterVolumeSpecName: "utilities") pod "384acf0c-1fab-456d-8f0a-7f5ddaffda57" (UID: "384acf0c-1fab-456d-8f0a-7f5ddaffda57"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:58:25 crc kubenswrapper[4687]: I0228 09:58:25.048039 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/384acf0c-1fab-456d-8f0a-7f5ddaffda57-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:58:25 crc kubenswrapper[4687]: I0228 09:58:25.053598 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/384acf0c-1fab-456d-8f0a-7f5ddaffda57-kube-api-access-wqqpn" (OuterVolumeSpecName: "kube-api-access-wqqpn") pod "384acf0c-1fab-456d-8f0a-7f5ddaffda57" (UID: "384acf0c-1fab-456d-8f0a-7f5ddaffda57"). InnerVolumeSpecName "kube-api-access-wqqpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:58:25 crc kubenswrapper[4687]: I0228 09:58:25.091265 4687 scope.go:117] "RemoveContainer" containerID="f6cbeb02dc69754c4e8ae02771a4be569a1c735e43f4903a02f49706795a15c4" Feb 28 09:58:25 crc kubenswrapper[4687]: I0228 09:58:25.099986 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/384acf0c-1fab-456d-8f0a-7f5ddaffda57-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "384acf0c-1fab-456d-8f0a-7f5ddaffda57" (UID: "384acf0c-1fab-456d-8f0a-7f5ddaffda57"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:58:25 crc kubenswrapper[4687]: I0228 09:58:25.115803 4687 scope.go:117] "RemoveContainer" containerID="9f3ac7e4dc2657a1e981c20a11f432104e282b1df9c2609b08f358f89ae82bfb" Feb 28 09:58:25 crc kubenswrapper[4687]: E0228 09:58:25.116306 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f3ac7e4dc2657a1e981c20a11f432104e282b1df9c2609b08f358f89ae82bfb\": container with ID starting with 9f3ac7e4dc2657a1e981c20a11f432104e282b1df9c2609b08f358f89ae82bfb not found: ID does not exist" containerID="9f3ac7e4dc2657a1e981c20a11f432104e282b1df9c2609b08f358f89ae82bfb" Feb 28 09:58:25 crc kubenswrapper[4687]: I0228 09:58:25.116351 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f3ac7e4dc2657a1e981c20a11f432104e282b1df9c2609b08f358f89ae82bfb"} err="failed to get container status \"9f3ac7e4dc2657a1e981c20a11f432104e282b1df9c2609b08f358f89ae82bfb\": rpc error: code = NotFound desc = could not find container \"9f3ac7e4dc2657a1e981c20a11f432104e282b1df9c2609b08f358f89ae82bfb\": container with ID starting with 9f3ac7e4dc2657a1e981c20a11f432104e282b1df9c2609b08f358f89ae82bfb not found: ID does not exist" Feb 28 09:58:25 crc kubenswrapper[4687]: I0228 09:58:25.116381 4687 scope.go:117] "RemoveContainer" containerID="f5fba2d1d40ab58cfd6dc3e06b6f434493035c48c235e6b73c1f68c274da99fd" Feb 28 09:58:25 crc kubenswrapper[4687]: E0228 09:58:25.116741 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5fba2d1d40ab58cfd6dc3e06b6f434493035c48c235e6b73c1f68c274da99fd\": container with ID starting with f5fba2d1d40ab58cfd6dc3e06b6f434493035c48c235e6b73c1f68c274da99fd not found: ID does not exist" containerID="f5fba2d1d40ab58cfd6dc3e06b6f434493035c48c235e6b73c1f68c274da99fd" Feb 28 09:58:25 crc kubenswrapper[4687]: I0228 09:58:25.116763 
4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5fba2d1d40ab58cfd6dc3e06b6f434493035c48c235e6b73c1f68c274da99fd"} err="failed to get container status \"f5fba2d1d40ab58cfd6dc3e06b6f434493035c48c235e6b73c1f68c274da99fd\": rpc error: code = NotFound desc = could not find container \"f5fba2d1d40ab58cfd6dc3e06b6f434493035c48c235e6b73c1f68c274da99fd\": container with ID starting with f5fba2d1d40ab58cfd6dc3e06b6f434493035c48c235e6b73c1f68c274da99fd not found: ID does not exist" Feb 28 09:58:25 crc kubenswrapper[4687]: I0228 09:58:25.116778 4687 scope.go:117] "RemoveContainer" containerID="f6cbeb02dc69754c4e8ae02771a4be569a1c735e43f4903a02f49706795a15c4" Feb 28 09:58:25 crc kubenswrapper[4687]: E0228 09:58:25.117152 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6cbeb02dc69754c4e8ae02771a4be569a1c735e43f4903a02f49706795a15c4\": container with ID starting with f6cbeb02dc69754c4e8ae02771a4be569a1c735e43f4903a02f49706795a15c4 not found: ID does not exist" containerID="f6cbeb02dc69754c4e8ae02771a4be569a1c735e43f4903a02f49706795a15c4" Feb 28 09:58:25 crc kubenswrapper[4687]: I0228 09:58:25.117175 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6cbeb02dc69754c4e8ae02771a4be569a1c735e43f4903a02f49706795a15c4"} err="failed to get container status \"f6cbeb02dc69754c4e8ae02771a4be569a1c735e43f4903a02f49706795a15c4\": rpc error: code = NotFound desc = could not find container \"f6cbeb02dc69754c4e8ae02771a4be569a1c735e43f4903a02f49706795a15c4\": container with ID starting with f6cbeb02dc69754c4e8ae02771a4be569a1c735e43f4903a02f49706795a15c4 not found: ID does not exist" Feb 28 09:58:25 crc kubenswrapper[4687]: I0228 09:58:25.150690 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqqpn\" (UniqueName: 
\"kubernetes.io/projected/384acf0c-1fab-456d-8f0a-7f5ddaffda57-kube-api-access-wqqpn\") on node \"crc\" DevicePath \"\"" Feb 28 09:58:25 crc kubenswrapper[4687]: I0228 09:58:25.150782 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/384acf0c-1fab-456d-8f0a-7f5ddaffda57-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:58:25 crc kubenswrapper[4687]: I0228 09:58:25.284063 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4t7zj"] Feb 28 09:58:25 crc kubenswrapper[4687]: I0228 09:58:25.291084 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4t7zj"] Feb 28 09:58:26 crc kubenswrapper[4687]: I0228 09:58:26.003427 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nzdtq" Feb 28 09:58:26 crc kubenswrapper[4687]: I0228 09:58:26.665475 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="384acf0c-1fab-456d-8f0a-7f5ddaffda57" path="/var/lib/kubelet/pods/384acf0c-1fab-456d-8f0a-7f5ddaffda57/volumes" Feb 28 09:58:28 crc kubenswrapper[4687]: I0228 09:58:28.062555 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nzdtq"] Feb 28 09:58:28 crc kubenswrapper[4687]: I0228 09:58:28.988582 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nzdtq" podUID="ad749506-e029-4d9e-96a9-69063a9c9dd0" containerName="registry-server" containerID="cri-o://7ea880d4f23bbe4b62dfa906b3bd6815993890edf363ba5f8e8d411bb726c8b0" gracePeriod=2 Feb 28 09:58:29 crc kubenswrapper[4687]: I0228 09:58:29.436037 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nzdtq" Feb 28 09:58:29 crc kubenswrapper[4687]: I0228 09:58:29.443568 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8kzw\" (UniqueName: \"kubernetes.io/projected/ad749506-e029-4d9e-96a9-69063a9c9dd0-kube-api-access-w8kzw\") pod \"ad749506-e029-4d9e-96a9-69063a9c9dd0\" (UID: \"ad749506-e029-4d9e-96a9-69063a9c9dd0\") " Feb 28 09:58:29 crc kubenswrapper[4687]: I0228 09:58:29.443863 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad749506-e029-4d9e-96a9-69063a9c9dd0-utilities\") pod \"ad749506-e029-4d9e-96a9-69063a9c9dd0\" (UID: \"ad749506-e029-4d9e-96a9-69063a9c9dd0\") " Feb 28 09:58:29 crc kubenswrapper[4687]: I0228 09:58:29.443892 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad749506-e029-4d9e-96a9-69063a9c9dd0-catalog-content\") pod \"ad749506-e029-4d9e-96a9-69063a9c9dd0\" (UID: \"ad749506-e029-4d9e-96a9-69063a9c9dd0\") " Feb 28 09:58:29 crc kubenswrapper[4687]: I0228 09:58:29.445783 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad749506-e029-4d9e-96a9-69063a9c9dd0-utilities" (OuterVolumeSpecName: "utilities") pod "ad749506-e029-4d9e-96a9-69063a9c9dd0" (UID: "ad749506-e029-4d9e-96a9-69063a9c9dd0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:58:29 crc kubenswrapper[4687]: I0228 09:58:29.448998 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad749506-e029-4d9e-96a9-69063a9c9dd0-kube-api-access-w8kzw" (OuterVolumeSpecName: "kube-api-access-w8kzw") pod "ad749506-e029-4d9e-96a9-69063a9c9dd0" (UID: "ad749506-e029-4d9e-96a9-69063a9c9dd0"). InnerVolumeSpecName "kube-api-access-w8kzw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:58:29 crc kubenswrapper[4687]: I0228 09:58:29.545721 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad749506-e029-4d9e-96a9-69063a9c9dd0-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:58:29 crc kubenswrapper[4687]: I0228 09:58:29.545757 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8kzw\" (UniqueName: \"kubernetes.io/projected/ad749506-e029-4d9e-96a9-69063a9c9dd0-kube-api-access-w8kzw\") on node \"crc\" DevicePath \"\"" Feb 28 09:58:29 crc kubenswrapper[4687]: I0228 09:58:29.569781 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad749506-e029-4d9e-96a9-69063a9c9dd0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad749506-e029-4d9e-96a9-69063a9c9dd0" (UID: "ad749506-e029-4d9e-96a9-69063a9c9dd0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:58:29 crc kubenswrapper[4687]: I0228 09:58:29.648453 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad749506-e029-4d9e-96a9-69063a9c9dd0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:58:29 crc kubenswrapper[4687]: I0228 09:58:29.999292 4687 generic.go:334] "Generic (PLEG): container finished" podID="ad749506-e029-4d9e-96a9-69063a9c9dd0" containerID="7ea880d4f23bbe4b62dfa906b3bd6815993890edf363ba5f8e8d411bb726c8b0" exitCode=0 Feb 28 09:58:29 crc kubenswrapper[4687]: I0228 09:58:29.999371 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzdtq" event={"ID":"ad749506-e029-4d9e-96a9-69063a9c9dd0","Type":"ContainerDied","Data":"7ea880d4f23bbe4b62dfa906b3bd6815993890edf363ba5f8e8d411bb726c8b0"} Feb 28 09:58:29 crc kubenswrapper[4687]: I0228 09:58:29.999600 4687 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-nzdtq" event={"ID":"ad749506-e029-4d9e-96a9-69063a9c9dd0","Type":"ContainerDied","Data":"eb86ac7848c5670038b82a16e0047c2efc9fb2e0f6d58ac63b79979afbed3746"} Feb 28 09:58:29 crc kubenswrapper[4687]: I0228 09:58:29.999626 4687 scope.go:117] "RemoveContainer" containerID="7ea880d4f23bbe4b62dfa906b3bd6815993890edf363ba5f8e8d411bb726c8b0" Feb 28 09:58:29 crc kubenswrapper[4687]: I0228 09:58:29.999404 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nzdtq" Feb 28 09:58:31 crc kubenswrapper[4687]: I0228 09:58:31.300686 4687 scope.go:117] "RemoveContainer" containerID="d169f55a267dc943feb27306810d06355b52ad314cd035dd45e016cf4a5b8c0f" Feb 28 09:58:31 crc kubenswrapper[4687]: I0228 09:58:31.317073 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nzdtq"] Feb 28 09:58:31 crc kubenswrapper[4687]: I0228 09:58:31.325104 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nzdtq"] Feb 28 09:58:31 crc kubenswrapper[4687]: I0228 09:58:31.327248 4687 scope.go:117] "RemoveContainer" containerID="c9d215aac639a6a1633054af7bc8365704c83858f26cbe5841a69bed670953ba" Feb 28 09:58:31 crc kubenswrapper[4687]: I0228 09:58:31.363200 4687 scope.go:117] "RemoveContainer" containerID="7ea880d4f23bbe4b62dfa906b3bd6815993890edf363ba5f8e8d411bb726c8b0" Feb 28 09:58:31 crc kubenswrapper[4687]: E0228 09:58:31.367381 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ea880d4f23bbe4b62dfa906b3bd6815993890edf363ba5f8e8d411bb726c8b0\": container with ID starting with 7ea880d4f23bbe4b62dfa906b3bd6815993890edf363ba5f8e8d411bb726c8b0 not found: ID does not exist" containerID="7ea880d4f23bbe4b62dfa906b3bd6815993890edf363ba5f8e8d411bb726c8b0" Feb 28 09:58:31 crc kubenswrapper[4687]: I0228 09:58:31.367445 4687 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ea880d4f23bbe4b62dfa906b3bd6815993890edf363ba5f8e8d411bb726c8b0"} err="failed to get container status \"7ea880d4f23bbe4b62dfa906b3bd6815993890edf363ba5f8e8d411bb726c8b0\": rpc error: code = NotFound desc = could not find container \"7ea880d4f23bbe4b62dfa906b3bd6815993890edf363ba5f8e8d411bb726c8b0\": container with ID starting with 7ea880d4f23bbe4b62dfa906b3bd6815993890edf363ba5f8e8d411bb726c8b0 not found: ID does not exist" Feb 28 09:58:31 crc kubenswrapper[4687]: I0228 09:58:31.367479 4687 scope.go:117] "RemoveContainer" containerID="d169f55a267dc943feb27306810d06355b52ad314cd035dd45e016cf4a5b8c0f" Feb 28 09:58:31 crc kubenswrapper[4687]: E0228 09:58:31.367863 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d169f55a267dc943feb27306810d06355b52ad314cd035dd45e016cf4a5b8c0f\": container with ID starting with d169f55a267dc943feb27306810d06355b52ad314cd035dd45e016cf4a5b8c0f not found: ID does not exist" containerID="d169f55a267dc943feb27306810d06355b52ad314cd035dd45e016cf4a5b8c0f" Feb 28 09:58:31 crc kubenswrapper[4687]: I0228 09:58:31.367906 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d169f55a267dc943feb27306810d06355b52ad314cd035dd45e016cf4a5b8c0f"} err="failed to get container status \"d169f55a267dc943feb27306810d06355b52ad314cd035dd45e016cf4a5b8c0f\": rpc error: code = NotFound desc = could not find container \"d169f55a267dc943feb27306810d06355b52ad314cd035dd45e016cf4a5b8c0f\": container with ID starting with d169f55a267dc943feb27306810d06355b52ad314cd035dd45e016cf4a5b8c0f not found: ID does not exist" Feb 28 09:58:31 crc kubenswrapper[4687]: I0228 09:58:31.367937 4687 scope.go:117] "RemoveContainer" containerID="c9d215aac639a6a1633054af7bc8365704c83858f26cbe5841a69bed670953ba" Feb 28 09:58:31 crc kubenswrapper[4687]: E0228 
09:58:31.368291 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9d215aac639a6a1633054af7bc8365704c83858f26cbe5841a69bed670953ba\": container with ID starting with c9d215aac639a6a1633054af7bc8365704c83858f26cbe5841a69bed670953ba not found: ID does not exist" containerID="c9d215aac639a6a1633054af7bc8365704c83858f26cbe5841a69bed670953ba" Feb 28 09:58:31 crc kubenswrapper[4687]: I0228 09:58:31.368323 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9d215aac639a6a1633054af7bc8365704c83858f26cbe5841a69bed670953ba"} err="failed to get container status \"c9d215aac639a6a1633054af7bc8365704c83858f26cbe5841a69bed670953ba\": rpc error: code = NotFound desc = could not find container \"c9d215aac639a6a1633054af7bc8365704c83858f26cbe5841a69bed670953ba\": container with ID starting with c9d215aac639a6a1633054af7bc8365704c83858f26cbe5841a69bed670953ba not found: ID does not exist" Feb 28 09:58:32 crc kubenswrapper[4687]: I0228 09:58:32.666356 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad749506-e029-4d9e-96a9-69063a9c9dd0" path="/var/lib/kubelet/pods/ad749506-e029-4d9e-96a9-69063a9c9dd0/volumes" Feb 28 09:58:42 crc kubenswrapper[4687]: I0228 09:58:42.100097 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-25mdv"] Feb 28 09:58:42 crc kubenswrapper[4687]: E0228 09:58:42.101146 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad749506-e029-4d9e-96a9-69063a9c9dd0" containerName="extract-utilities" Feb 28 09:58:42 crc kubenswrapper[4687]: I0228 09:58:42.101162 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad749506-e029-4d9e-96a9-69063a9c9dd0" containerName="extract-utilities" Feb 28 09:58:42 crc kubenswrapper[4687]: E0228 09:58:42.101195 4687 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="384acf0c-1fab-456d-8f0a-7f5ddaffda57" containerName="extract-content" Feb 28 09:58:42 crc kubenswrapper[4687]: I0228 09:58:42.101201 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="384acf0c-1fab-456d-8f0a-7f5ddaffda57" containerName="extract-content" Feb 28 09:58:42 crc kubenswrapper[4687]: E0228 09:58:42.101211 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="384acf0c-1fab-456d-8f0a-7f5ddaffda57" containerName="extract-utilities" Feb 28 09:58:42 crc kubenswrapper[4687]: I0228 09:58:42.101217 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="384acf0c-1fab-456d-8f0a-7f5ddaffda57" containerName="extract-utilities" Feb 28 09:58:42 crc kubenswrapper[4687]: E0228 09:58:42.101235 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad749506-e029-4d9e-96a9-69063a9c9dd0" containerName="registry-server" Feb 28 09:58:42 crc kubenswrapper[4687]: I0228 09:58:42.101242 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad749506-e029-4d9e-96a9-69063a9c9dd0" containerName="registry-server" Feb 28 09:58:42 crc kubenswrapper[4687]: E0228 09:58:42.101256 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad749506-e029-4d9e-96a9-69063a9c9dd0" containerName="extract-content" Feb 28 09:58:42 crc kubenswrapper[4687]: I0228 09:58:42.101262 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad749506-e029-4d9e-96a9-69063a9c9dd0" containerName="extract-content" Feb 28 09:58:42 crc kubenswrapper[4687]: E0228 09:58:42.101270 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="384acf0c-1fab-456d-8f0a-7f5ddaffda57" containerName="registry-server" Feb 28 09:58:42 crc kubenswrapper[4687]: I0228 09:58:42.101275 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="384acf0c-1fab-456d-8f0a-7f5ddaffda57" containerName="registry-server" Feb 28 09:58:42 crc kubenswrapper[4687]: I0228 09:58:42.101498 4687 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ad749506-e029-4d9e-96a9-69063a9c9dd0" containerName="registry-server" Feb 28 09:58:42 crc kubenswrapper[4687]: I0228 09:58:42.101527 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="384acf0c-1fab-456d-8f0a-7f5ddaffda57" containerName="registry-server" Feb 28 09:58:42 crc kubenswrapper[4687]: I0228 09:58:42.103111 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-25mdv" Feb 28 09:58:42 crc kubenswrapper[4687]: I0228 09:58:42.123531 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-25mdv"] Feb 28 09:58:42 crc kubenswrapper[4687]: I0228 09:58:42.126345 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd06b20d-ab02-42f0-a1d2-c26e96c51133-catalog-content\") pod \"community-operators-25mdv\" (UID: \"cd06b20d-ab02-42f0-a1d2-c26e96c51133\") " pod="openshift-marketplace/community-operators-25mdv" Feb 28 09:58:42 crc kubenswrapper[4687]: I0228 09:58:42.127143 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlrsh\" (UniqueName: \"kubernetes.io/projected/cd06b20d-ab02-42f0-a1d2-c26e96c51133-kube-api-access-mlrsh\") pod \"community-operators-25mdv\" (UID: \"cd06b20d-ab02-42f0-a1d2-c26e96c51133\") " pod="openshift-marketplace/community-operators-25mdv" Feb 28 09:58:42 crc kubenswrapper[4687]: I0228 09:58:42.127240 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd06b20d-ab02-42f0-a1d2-c26e96c51133-utilities\") pod \"community-operators-25mdv\" (UID: \"cd06b20d-ab02-42f0-a1d2-c26e96c51133\") " pod="openshift-marketplace/community-operators-25mdv" Feb 28 09:58:42 crc kubenswrapper[4687]: I0228 09:58:42.228830 4687 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mlrsh\" (UniqueName: \"kubernetes.io/projected/cd06b20d-ab02-42f0-a1d2-c26e96c51133-kube-api-access-mlrsh\") pod \"community-operators-25mdv\" (UID: \"cd06b20d-ab02-42f0-a1d2-c26e96c51133\") " pod="openshift-marketplace/community-operators-25mdv" Feb 28 09:58:42 crc kubenswrapper[4687]: I0228 09:58:42.229281 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd06b20d-ab02-42f0-a1d2-c26e96c51133-utilities\") pod \"community-operators-25mdv\" (UID: \"cd06b20d-ab02-42f0-a1d2-c26e96c51133\") " pod="openshift-marketplace/community-operators-25mdv" Feb 28 09:58:42 crc kubenswrapper[4687]: I0228 09:58:42.229680 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd06b20d-ab02-42f0-a1d2-c26e96c51133-utilities\") pod \"community-operators-25mdv\" (UID: \"cd06b20d-ab02-42f0-a1d2-c26e96c51133\") " pod="openshift-marketplace/community-operators-25mdv" Feb 28 09:58:42 crc kubenswrapper[4687]: I0228 09:58:42.229754 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd06b20d-ab02-42f0-a1d2-c26e96c51133-catalog-content\") pod \"community-operators-25mdv\" (UID: \"cd06b20d-ab02-42f0-a1d2-c26e96c51133\") " pod="openshift-marketplace/community-operators-25mdv" Feb 28 09:58:42 crc kubenswrapper[4687]: I0228 09:58:42.229986 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd06b20d-ab02-42f0-a1d2-c26e96c51133-catalog-content\") pod \"community-operators-25mdv\" (UID: \"cd06b20d-ab02-42f0-a1d2-c26e96c51133\") " pod="openshift-marketplace/community-operators-25mdv" Feb 28 09:58:42 crc kubenswrapper[4687]: I0228 09:58:42.247448 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mlrsh\" (UniqueName: \"kubernetes.io/projected/cd06b20d-ab02-42f0-a1d2-c26e96c51133-kube-api-access-mlrsh\") pod \"community-operators-25mdv\" (UID: \"cd06b20d-ab02-42f0-a1d2-c26e96c51133\") " pod="openshift-marketplace/community-operators-25mdv" Feb 28 09:58:42 crc kubenswrapper[4687]: I0228 09:58:42.422972 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-25mdv" Feb 28 09:58:42 crc kubenswrapper[4687]: I0228 09:58:42.977011 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-25mdv"] Feb 28 09:58:42 crc kubenswrapper[4687]: W0228 09:58:42.983801 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd06b20d_ab02_42f0_a1d2_c26e96c51133.slice/crio-d195ee90896fe5a8af6063500681d3acbfb5b971dcc925d8a1f8446eb5a8afb7 WatchSource:0}: Error finding container d195ee90896fe5a8af6063500681d3acbfb5b971dcc925d8a1f8446eb5a8afb7: Status 404 returned error can't find the container with id d195ee90896fe5a8af6063500681d3acbfb5b971dcc925d8a1f8446eb5a8afb7 Feb 28 09:58:43 crc kubenswrapper[4687]: I0228 09:58:43.119082 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25mdv" event={"ID":"cd06b20d-ab02-42f0-a1d2-c26e96c51133","Type":"ContainerStarted","Data":"3f10909e18b3e72c87f7721e6e9296656dc80be8c18a414d8043f0a984939455"} Feb 28 09:58:43 crc kubenswrapper[4687]: I0228 09:58:43.119136 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25mdv" event={"ID":"cd06b20d-ab02-42f0-a1d2-c26e96c51133","Type":"ContainerStarted","Data":"d195ee90896fe5a8af6063500681d3acbfb5b971dcc925d8a1f8446eb5a8afb7"} Feb 28 09:58:44 crc kubenswrapper[4687]: I0228 09:58:44.132159 4687 generic.go:334] "Generic (PLEG): container finished" podID="cd06b20d-ab02-42f0-a1d2-c26e96c51133" 
containerID="3f10909e18b3e72c87f7721e6e9296656dc80be8c18a414d8043f0a984939455" exitCode=0 Feb 28 09:58:44 crc kubenswrapper[4687]: I0228 09:58:44.132288 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25mdv" event={"ID":"cd06b20d-ab02-42f0-a1d2-c26e96c51133","Type":"ContainerDied","Data":"3f10909e18b3e72c87f7721e6e9296656dc80be8c18a414d8043f0a984939455"} Feb 28 09:58:44 crc kubenswrapper[4687]: I0228 09:58:44.323694 4687 scope.go:117] "RemoveContainer" containerID="d62de7ddf93325e029768f88eb8dcd9e33fb888a0fe194af3b5739de519a67fb" Feb 28 09:58:45 crc kubenswrapper[4687]: I0228 09:58:45.145620 4687 generic.go:334] "Generic (PLEG): container finished" podID="cd06b20d-ab02-42f0-a1d2-c26e96c51133" containerID="f42b96f9057bd8fbf9e8f3adfc0e76604078f02f798d01d1f985c37aa02e8e86" exitCode=0 Feb 28 09:58:45 crc kubenswrapper[4687]: I0228 09:58:45.145729 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25mdv" event={"ID":"cd06b20d-ab02-42f0-a1d2-c26e96c51133","Type":"ContainerDied","Data":"f42b96f9057bd8fbf9e8f3adfc0e76604078f02f798d01d1f985c37aa02e8e86"} Feb 28 09:58:46 crc kubenswrapper[4687]: I0228 09:58:46.156152 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25mdv" event={"ID":"cd06b20d-ab02-42f0-a1d2-c26e96c51133","Type":"ContainerStarted","Data":"8ca2200809b85b5d948b43b2f74a312241ebdac0eccad22aef7d4aca979de75c"} Feb 28 09:58:46 crc kubenswrapper[4687]: I0228 09:58:46.173265 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-25mdv" podStartSLOduration=2.663292146 podStartE2EDuration="4.173248561s" podCreationTimestamp="2026-02-28 09:58:42 +0000 UTC" firstStartedPulling="2026-02-28 09:58:44.136125436 +0000 UTC m=+3315.826694773" lastFinishedPulling="2026-02-28 09:58:45.646081851 +0000 UTC m=+3317.336651188" observedRunningTime="2026-02-28 
09:58:46.171169723 +0000 UTC m=+3317.861739060" watchObservedRunningTime="2026-02-28 09:58:46.173248561 +0000 UTC m=+3317.863817898" Feb 28 09:58:50 crc kubenswrapper[4687]: I0228 09:58:50.212999 4687 generic.go:334] "Generic (PLEG): container finished" podID="70dfcf06-9de6-4f67-87dd-e8196b03d762" containerID="6a188535abca0d0acd989b48e413b5a8abcba88fc2373fb5bc78084099db5c25" exitCode=0 Feb 28 09:58:50 crc kubenswrapper[4687]: I0228 09:58:50.213076 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j45kw/crc-debug-rqkqf" event={"ID":"70dfcf06-9de6-4f67-87dd-e8196b03d762","Type":"ContainerDied","Data":"6a188535abca0d0acd989b48e413b5a8abcba88fc2373fb5bc78084099db5c25"} Feb 28 09:58:51 crc kubenswrapper[4687]: I0228 09:58:51.316923 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j45kw/crc-debug-rqkqf" Feb 28 09:58:51 crc kubenswrapper[4687]: I0228 09:58:51.343256 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-j45kw/crc-debug-rqkqf"] Feb 28 09:58:51 crc kubenswrapper[4687]: I0228 09:58:51.350255 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-j45kw/crc-debug-rqkqf"] Feb 28 09:58:51 crc kubenswrapper[4687]: I0228 09:58:51.377074 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97pb6\" (UniqueName: \"kubernetes.io/projected/70dfcf06-9de6-4f67-87dd-e8196b03d762-kube-api-access-97pb6\") pod \"70dfcf06-9de6-4f67-87dd-e8196b03d762\" (UID: \"70dfcf06-9de6-4f67-87dd-e8196b03d762\") " Feb 28 09:58:51 crc kubenswrapper[4687]: I0228 09:58:51.377111 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70dfcf06-9de6-4f67-87dd-e8196b03d762-host\") pod \"70dfcf06-9de6-4f67-87dd-e8196b03d762\" (UID: \"70dfcf06-9de6-4f67-87dd-e8196b03d762\") " Feb 28 09:58:51 crc kubenswrapper[4687]: I0228 
09:58:51.377387 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/70dfcf06-9de6-4f67-87dd-e8196b03d762-host" (OuterVolumeSpecName: "host") pod "70dfcf06-9de6-4f67-87dd-e8196b03d762" (UID: "70dfcf06-9de6-4f67-87dd-e8196b03d762"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:58:51 crc kubenswrapper[4687]: I0228 09:58:51.377751 4687 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70dfcf06-9de6-4f67-87dd-e8196b03d762-host\") on node \"crc\" DevicePath \"\"" Feb 28 09:58:51 crc kubenswrapper[4687]: I0228 09:58:51.385640 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70dfcf06-9de6-4f67-87dd-e8196b03d762-kube-api-access-97pb6" (OuterVolumeSpecName: "kube-api-access-97pb6") pod "70dfcf06-9de6-4f67-87dd-e8196b03d762" (UID: "70dfcf06-9de6-4f67-87dd-e8196b03d762"). InnerVolumeSpecName "kube-api-access-97pb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:58:51 crc kubenswrapper[4687]: I0228 09:58:51.479914 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97pb6\" (UniqueName: \"kubernetes.io/projected/70dfcf06-9de6-4f67-87dd-e8196b03d762-kube-api-access-97pb6\") on node \"crc\" DevicePath \"\"" Feb 28 09:58:52 crc kubenswrapper[4687]: I0228 09:58:52.234838 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac959f4af12e144e7c3f9c058715bf960f6966a843835d758d373433839b881d" Feb 28 09:58:52 crc kubenswrapper[4687]: I0228 09:58:52.234922 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j45kw/crc-debug-rqkqf" Feb 28 09:58:52 crc kubenswrapper[4687]: I0228 09:58:52.423845 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-25mdv" Feb 28 09:58:52 crc kubenswrapper[4687]: I0228 09:58:52.423898 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-25mdv" Feb 28 09:58:52 crc kubenswrapper[4687]: I0228 09:58:52.467145 4687 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-25mdv" Feb 28 09:58:52 crc kubenswrapper[4687]: I0228 09:58:52.500171 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-j45kw/crc-debug-5vmwq"] Feb 28 09:58:52 crc kubenswrapper[4687]: E0228 09:58:52.500610 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70dfcf06-9de6-4f67-87dd-e8196b03d762" containerName="container-00" Feb 28 09:58:52 crc kubenswrapper[4687]: I0228 09:58:52.500628 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="70dfcf06-9de6-4f67-87dd-e8196b03d762" containerName="container-00" Feb 28 09:58:52 crc kubenswrapper[4687]: I0228 09:58:52.500791 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="70dfcf06-9de6-4f67-87dd-e8196b03d762" containerName="container-00" Feb 28 09:58:52 crc kubenswrapper[4687]: I0228 09:58:52.501397 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j45kw/crc-debug-5vmwq" Feb 28 09:58:52 crc kubenswrapper[4687]: I0228 09:58:52.505412 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-j45kw"/"default-dockercfg-zh9xg" Feb 28 09:58:52 crc kubenswrapper[4687]: I0228 09:58:52.604326 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxw44\" (UniqueName: \"kubernetes.io/projected/f9eca3ac-f0bd-488b-967c-8e8805227670-kube-api-access-cxw44\") pod \"crc-debug-5vmwq\" (UID: \"f9eca3ac-f0bd-488b-967c-8e8805227670\") " pod="openshift-must-gather-j45kw/crc-debug-5vmwq" Feb 28 09:58:52 crc kubenswrapper[4687]: I0228 09:58:52.604387 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9eca3ac-f0bd-488b-967c-8e8805227670-host\") pod \"crc-debug-5vmwq\" (UID: \"f9eca3ac-f0bd-488b-967c-8e8805227670\") " pod="openshift-must-gather-j45kw/crc-debug-5vmwq" Feb 28 09:58:52 crc kubenswrapper[4687]: I0228 09:58:52.666934 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70dfcf06-9de6-4f67-87dd-e8196b03d762" path="/var/lib/kubelet/pods/70dfcf06-9de6-4f67-87dd-e8196b03d762/volumes" Feb 28 09:58:52 crc kubenswrapper[4687]: I0228 09:58:52.705776 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9eca3ac-f0bd-488b-967c-8e8805227670-host\") pod \"crc-debug-5vmwq\" (UID: \"f9eca3ac-f0bd-488b-967c-8e8805227670\") " pod="openshift-must-gather-j45kw/crc-debug-5vmwq" Feb 28 09:58:52 crc kubenswrapper[4687]: I0228 09:58:52.705911 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9eca3ac-f0bd-488b-967c-8e8805227670-host\") pod \"crc-debug-5vmwq\" (UID: \"f9eca3ac-f0bd-488b-967c-8e8805227670\") " 
pod="openshift-must-gather-j45kw/crc-debug-5vmwq" Feb 28 09:58:52 crc kubenswrapper[4687]: I0228 09:58:52.706959 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxw44\" (UniqueName: \"kubernetes.io/projected/f9eca3ac-f0bd-488b-967c-8e8805227670-kube-api-access-cxw44\") pod \"crc-debug-5vmwq\" (UID: \"f9eca3ac-f0bd-488b-967c-8e8805227670\") " pod="openshift-must-gather-j45kw/crc-debug-5vmwq" Feb 28 09:58:52 crc kubenswrapper[4687]: I0228 09:58:52.728083 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxw44\" (UniqueName: \"kubernetes.io/projected/f9eca3ac-f0bd-488b-967c-8e8805227670-kube-api-access-cxw44\") pod \"crc-debug-5vmwq\" (UID: \"f9eca3ac-f0bd-488b-967c-8e8805227670\") " pod="openshift-must-gather-j45kw/crc-debug-5vmwq" Feb 28 09:58:52 crc kubenswrapper[4687]: I0228 09:58:52.815518 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j45kw/crc-debug-5vmwq" Feb 28 09:58:52 crc kubenswrapper[4687]: W0228 09:58:52.849147 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9eca3ac_f0bd_488b_967c_8e8805227670.slice/crio-fbea4ba9371176330b5d0ae47e5027d90d222697c0f7edd197936b09ad22878e WatchSource:0}: Error finding container fbea4ba9371176330b5d0ae47e5027d90d222697c0f7edd197936b09ad22878e: Status 404 returned error can't find the container with id fbea4ba9371176330b5d0ae47e5027d90d222697c0f7edd197936b09ad22878e Feb 28 09:58:53 crc kubenswrapper[4687]: I0228 09:58:53.244911 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j45kw/crc-debug-5vmwq" event={"ID":"f9eca3ac-f0bd-488b-967c-8e8805227670","Type":"ContainerStarted","Data":"601b80f5ad7c1d52b0505b7e4cb2bb7165e2b6fa8fab5ce88611446019dcaa25"} Feb 28 09:58:53 crc kubenswrapper[4687]: I0228 09:58:53.245153 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-j45kw/crc-debug-5vmwq" event={"ID":"f9eca3ac-f0bd-488b-967c-8e8805227670","Type":"ContainerStarted","Data":"fbea4ba9371176330b5d0ae47e5027d90d222697c0f7edd197936b09ad22878e"} Feb 28 09:58:53 crc kubenswrapper[4687]: I0228 09:58:53.259502 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-j45kw/crc-debug-5vmwq" podStartSLOduration=1.259486734 podStartE2EDuration="1.259486734s" podCreationTimestamp="2026-02-28 09:58:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 09:58:53.254959022 +0000 UTC m=+3324.945528369" watchObservedRunningTime="2026-02-28 09:58:53.259486734 +0000 UTC m=+3324.950056071" Feb 28 09:58:53 crc kubenswrapper[4687]: I0228 09:58:53.301602 4687 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-25mdv" Feb 28 09:58:54 crc kubenswrapper[4687]: I0228 09:58:54.259042 4687 generic.go:334] "Generic (PLEG): container finished" podID="f9eca3ac-f0bd-488b-967c-8e8805227670" containerID="601b80f5ad7c1d52b0505b7e4cb2bb7165e2b6fa8fab5ce88611446019dcaa25" exitCode=0 Feb 28 09:58:54 crc kubenswrapper[4687]: I0228 09:58:54.259128 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j45kw/crc-debug-5vmwq" event={"ID":"f9eca3ac-f0bd-488b-967c-8e8805227670","Type":"ContainerDied","Data":"601b80f5ad7c1d52b0505b7e4cb2bb7165e2b6fa8fab5ce88611446019dcaa25"} Feb 28 09:58:54 crc kubenswrapper[4687]: I0228 09:58:54.691737 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-25mdv"] Feb 28 09:58:55 crc kubenswrapper[4687]: I0228 09:58:55.268167 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-25mdv" podUID="cd06b20d-ab02-42f0-a1d2-c26e96c51133" containerName="registry-server" 
containerID="cri-o://8ca2200809b85b5d948b43b2f74a312241ebdac0eccad22aef7d4aca979de75c" gracePeriod=2 Feb 28 09:58:55 crc kubenswrapper[4687]: I0228 09:58:55.436836 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j45kw/crc-debug-5vmwq" Feb 28 09:58:55 crc kubenswrapper[4687]: I0228 09:58:55.465865 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-j45kw/crc-debug-5vmwq"] Feb 28 09:58:55 crc kubenswrapper[4687]: I0228 09:58:55.473453 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9eca3ac-f0bd-488b-967c-8e8805227670-host\") pod \"f9eca3ac-f0bd-488b-967c-8e8805227670\" (UID: \"f9eca3ac-f0bd-488b-967c-8e8805227670\") " Feb 28 09:58:55 crc kubenswrapper[4687]: I0228 09:58:55.473563 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9eca3ac-f0bd-488b-967c-8e8805227670-host" (OuterVolumeSpecName: "host") pod "f9eca3ac-f0bd-488b-967c-8e8805227670" (UID: "f9eca3ac-f0bd-488b-967c-8e8805227670"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:58:55 crc kubenswrapper[4687]: I0228 09:58:55.473924 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxw44\" (UniqueName: \"kubernetes.io/projected/f9eca3ac-f0bd-488b-967c-8e8805227670-kube-api-access-cxw44\") pod \"f9eca3ac-f0bd-488b-967c-8e8805227670\" (UID: \"f9eca3ac-f0bd-488b-967c-8e8805227670\") " Feb 28 09:58:55 crc kubenswrapper[4687]: I0228 09:58:55.474649 4687 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9eca3ac-f0bd-488b-967c-8e8805227670-host\") on node \"crc\" DevicePath \"\"" Feb 28 09:58:55 crc kubenswrapper[4687]: I0228 09:58:55.482401 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9eca3ac-f0bd-488b-967c-8e8805227670-kube-api-access-cxw44" (OuterVolumeSpecName: "kube-api-access-cxw44") pod "f9eca3ac-f0bd-488b-967c-8e8805227670" (UID: "f9eca3ac-f0bd-488b-967c-8e8805227670"). InnerVolumeSpecName "kube-api-access-cxw44". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:58:55 crc kubenswrapper[4687]: I0228 09:58:55.484409 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-j45kw/crc-debug-5vmwq"] Feb 28 09:58:55 crc kubenswrapper[4687]: I0228 09:58:55.576687 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxw44\" (UniqueName: \"kubernetes.io/projected/f9eca3ac-f0bd-488b-967c-8e8805227670-kube-api-access-cxw44\") on node \"crc\" DevicePath \"\"" Feb 28 09:58:55 crc kubenswrapper[4687]: I0228 09:58:55.673820 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-25mdv" Feb 28 09:58:55 crc kubenswrapper[4687]: I0228 09:58:55.678567 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd06b20d-ab02-42f0-a1d2-c26e96c51133-utilities\") pod \"cd06b20d-ab02-42f0-a1d2-c26e96c51133\" (UID: \"cd06b20d-ab02-42f0-a1d2-c26e96c51133\") " Feb 28 09:58:55 crc kubenswrapper[4687]: I0228 09:58:55.678623 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlrsh\" (UniqueName: \"kubernetes.io/projected/cd06b20d-ab02-42f0-a1d2-c26e96c51133-kube-api-access-mlrsh\") pod \"cd06b20d-ab02-42f0-a1d2-c26e96c51133\" (UID: \"cd06b20d-ab02-42f0-a1d2-c26e96c51133\") " Feb 28 09:58:55 crc kubenswrapper[4687]: I0228 09:58:55.678684 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd06b20d-ab02-42f0-a1d2-c26e96c51133-catalog-content\") pod \"cd06b20d-ab02-42f0-a1d2-c26e96c51133\" (UID: \"cd06b20d-ab02-42f0-a1d2-c26e96c51133\") " Feb 28 09:58:55 crc kubenswrapper[4687]: I0228 09:58:55.679413 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd06b20d-ab02-42f0-a1d2-c26e96c51133-utilities" (OuterVolumeSpecName: "utilities") pod "cd06b20d-ab02-42f0-a1d2-c26e96c51133" (UID: "cd06b20d-ab02-42f0-a1d2-c26e96c51133"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:58:55 crc kubenswrapper[4687]: I0228 09:58:55.683166 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd06b20d-ab02-42f0-a1d2-c26e96c51133-kube-api-access-mlrsh" (OuterVolumeSpecName: "kube-api-access-mlrsh") pod "cd06b20d-ab02-42f0-a1d2-c26e96c51133" (UID: "cd06b20d-ab02-42f0-a1d2-c26e96c51133"). InnerVolumeSpecName "kube-api-access-mlrsh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:58:55 crc kubenswrapper[4687]: I0228 09:58:55.730313 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd06b20d-ab02-42f0-a1d2-c26e96c51133-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd06b20d-ab02-42f0-a1d2-c26e96c51133" (UID: "cd06b20d-ab02-42f0-a1d2-c26e96c51133"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 09:58:55 crc kubenswrapper[4687]: I0228 09:58:55.781361 4687 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd06b20d-ab02-42f0-a1d2-c26e96c51133-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 09:58:55 crc kubenswrapper[4687]: I0228 09:58:55.781396 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlrsh\" (UniqueName: \"kubernetes.io/projected/cd06b20d-ab02-42f0-a1d2-c26e96c51133-kube-api-access-mlrsh\") on node \"crc\" DevicePath \"\"" Feb 28 09:58:55 crc kubenswrapper[4687]: I0228 09:58:55.781411 4687 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd06b20d-ab02-42f0-a1d2-c26e96c51133-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.277653 4687 generic.go:334] "Generic (PLEG): container finished" podID="cd06b20d-ab02-42f0-a1d2-c26e96c51133" containerID="8ca2200809b85b5d948b43b2f74a312241ebdac0eccad22aef7d4aca979de75c" exitCode=0 Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.277737 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25mdv" event={"ID":"cd06b20d-ab02-42f0-a1d2-c26e96c51133","Type":"ContainerDied","Data":"8ca2200809b85b5d948b43b2f74a312241ebdac0eccad22aef7d4aca979de75c"} Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.277786 4687 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-25mdv" event={"ID":"cd06b20d-ab02-42f0-a1d2-c26e96c51133","Type":"ContainerDied","Data":"d195ee90896fe5a8af6063500681d3acbfb5b971dcc925d8a1f8446eb5a8afb7"} Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.277806 4687 scope.go:117] "RemoveContainer" containerID="8ca2200809b85b5d948b43b2f74a312241ebdac0eccad22aef7d4aca979de75c" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.277962 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-25mdv" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.286348 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbea4ba9371176330b5d0ae47e5027d90d222697c0f7edd197936b09ad22878e" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.286422 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j45kw/crc-debug-5vmwq" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.331764 4687 scope.go:117] "RemoveContainer" containerID="f42b96f9057bd8fbf9e8f3adfc0e76604078f02f798d01d1f985c37aa02e8e86" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.342639 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-25mdv"] Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.350316 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-25mdv"] Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.365717 4687 scope.go:117] "RemoveContainer" containerID="3f10909e18b3e72c87f7721e6e9296656dc80be8c18a414d8043f0a984939455" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.397667 4687 scope.go:117] "RemoveContainer" containerID="8ca2200809b85b5d948b43b2f74a312241ebdac0eccad22aef7d4aca979de75c" Feb 28 09:58:56 crc kubenswrapper[4687]: E0228 09:58:56.398390 4687 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"8ca2200809b85b5d948b43b2f74a312241ebdac0eccad22aef7d4aca979de75c\": container with ID starting with 8ca2200809b85b5d948b43b2f74a312241ebdac0eccad22aef7d4aca979de75c not found: ID does not exist" containerID="8ca2200809b85b5d948b43b2f74a312241ebdac0eccad22aef7d4aca979de75c" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.398430 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ca2200809b85b5d948b43b2f74a312241ebdac0eccad22aef7d4aca979de75c"} err="failed to get container status \"8ca2200809b85b5d948b43b2f74a312241ebdac0eccad22aef7d4aca979de75c\": rpc error: code = NotFound desc = could not find container \"8ca2200809b85b5d948b43b2f74a312241ebdac0eccad22aef7d4aca979de75c\": container with ID starting with 8ca2200809b85b5d948b43b2f74a312241ebdac0eccad22aef7d4aca979de75c not found: ID does not exist" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.398469 4687 scope.go:117] "RemoveContainer" containerID="f42b96f9057bd8fbf9e8f3adfc0e76604078f02f798d01d1f985c37aa02e8e86" Feb 28 09:58:56 crc kubenswrapper[4687]: E0228 09:58:56.398776 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f42b96f9057bd8fbf9e8f3adfc0e76604078f02f798d01d1f985c37aa02e8e86\": container with ID starting with f42b96f9057bd8fbf9e8f3adfc0e76604078f02f798d01d1f985c37aa02e8e86 not found: ID does not exist" containerID="f42b96f9057bd8fbf9e8f3adfc0e76604078f02f798d01d1f985c37aa02e8e86" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.398809 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f42b96f9057bd8fbf9e8f3adfc0e76604078f02f798d01d1f985c37aa02e8e86"} err="failed to get container status \"f42b96f9057bd8fbf9e8f3adfc0e76604078f02f798d01d1f985c37aa02e8e86\": rpc error: code = NotFound desc = could not find container 
\"f42b96f9057bd8fbf9e8f3adfc0e76604078f02f798d01d1f985c37aa02e8e86\": container with ID starting with f42b96f9057bd8fbf9e8f3adfc0e76604078f02f798d01d1f985c37aa02e8e86 not found: ID does not exist" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.398849 4687 scope.go:117] "RemoveContainer" containerID="3f10909e18b3e72c87f7721e6e9296656dc80be8c18a414d8043f0a984939455" Feb 28 09:58:56 crc kubenswrapper[4687]: E0228 09:58:56.399101 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f10909e18b3e72c87f7721e6e9296656dc80be8c18a414d8043f0a984939455\": container with ID starting with 3f10909e18b3e72c87f7721e6e9296656dc80be8c18a414d8043f0a984939455 not found: ID does not exist" containerID="3f10909e18b3e72c87f7721e6e9296656dc80be8c18a414d8043f0a984939455" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.399145 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f10909e18b3e72c87f7721e6e9296656dc80be8c18a414d8043f0a984939455"} err="failed to get container status \"3f10909e18b3e72c87f7721e6e9296656dc80be8c18a414d8043f0a984939455\": rpc error: code = NotFound desc = could not find container \"3f10909e18b3e72c87f7721e6e9296656dc80be8c18a414d8043f0a984939455\": container with ID starting with 3f10909e18b3e72c87f7721e6e9296656dc80be8c18a414d8043f0a984939455 not found: ID does not exist" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.674141 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd06b20d-ab02-42f0-a1d2-c26e96c51133" path="/var/lib/kubelet/pods/cd06b20d-ab02-42f0-a1d2-c26e96c51133/volumes" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.674907 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9eca3ac-f0bd-488b-967c-8e8805227670" path="/var/lib/kubelet/pods/f9eca3ac-f0bd-488b-967c-8e8805227670/volumes" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.675500 
4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-j45kw/crc-debug-kjnk7"] Feb 28 09:58:56 crc kubenswrapper[4687]: E0228 09:58:56.675843 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd06b20d-ab02-42f0-a1d2-c26e96c51133" containerName="registry-server" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.675860 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd06b20d-ab02-42f0-a1d2-c26e96c51133" containerName="registry-server" Feb 28 09:58:56 crc kubenswrapper[4687]: E0228 09:58:56.675879 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd06b20d-ab02-42f0-a1d2-c26e96c51133" containerName="extract-content" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.675885 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd06b20d-ab02-42f0-a1d2-c26e96c51133" containerName="extract-content" Feb 28 09:58:56 crc kubenswrapper[4687]: E0228 09:58:56.675901 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9eca3ac-f0bd-488b-967c-8e8805227670" containerName="container-00" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.675906 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9eca3ac-f0bd-488b-967c-8e8805227670" containerName="container-00" Feb 28 09:58:56 crc kubenswrapper[4687]: E0228 09:58:56.675933 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd06b20d-ab02-42f0-a1d2-c26e96c51133" containerName="extract-utilities" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.675939 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd06b20d-ab02-42f0-a1d2-c26e96c51133" containerName="extract-utilities" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.676984 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9eca3ac-f0bd-488b-967c-8e8805227670" containerName="container-00" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.678588 4687 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="cd06b20d-ab02-42f0-a1d2-c26e96c51133" containerName="registry-server" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.679390 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j45kw/crc-debug-kjnk7" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.684479 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-j45kw"/"default-dockercfg-zh9xg" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.824447 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f93f0f31-842e-48a0-9f96-cab7607741c0-host\") pod \"crc-debug-kjnk7\" (UID: \"f93f0f31-842e-48a0-9f96-cab7607741c0\") " pod="openshift-must-gather-j45kw/crc-debug-kjnk7" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.824924 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmk8p\" (UniqueName: \"kubernetes.io/projected/f93f0f31-842e-48a0-9f96-cab7607741c0-kube-api-access-vmk8p\") pod \"crc-debug-kjnk7\" (UID: \"f93f0f31-842e-48a0-9f96-cab7607741c0\") " pod="openshift-must-gather-j45kw/crc-debug-kjnk7" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.926524 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f93f0f31-842e-48a0-9f96-cab7607741c0-host\") pod \"crc-debug-kjnk7\" (UID: \"f93f0f31-842e-48a0-9f96-cab7607741c0\") " pod="openshift-must-gather-j45kw/crc-debug-kjnk7" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.926582 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmk8p\" (UniqueName: \"kubernetes.io/projected/f93f0f31-842e-48a0-9f96-cab7607741c0-kube-api-access-vmk8p\") pod \"crc-debug-kjnk7\" (UID: \"f93f0f31-842e-48a0-9f96-cab7607741c0\") " 
pod="openshift-must-gather-j45kw/crc-debug-kjnk7" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.926900 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f93f0f31-842e-48a0-9f96-cab7607741c0-host\") pod \"crc-debug-kjnk7\" (UID: \"f93f0f31-842e-48a0-9f96-cab7607741c0\") " pod="openshift-must-gather-j45kw/crc-debug-kjnk7" Feb 28 09:58:56 crc kubenswrapper[4687]: I0228 09:58:56.960597 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmk8p\" (UniqueName: \"kubernetes.io/projected/f93f0f31-842e-48a0-9f96-cab7607741c0-kube-api-access-vmk8p\") pod \"crc-debug-kjnk7\" (UID: \"f93f0f31-842e-48a0-9f96-cab7607741c0\") " pod="openshift-must-gather-j45kw/crc-debug-kjnk7" Feb 28 09:58:57 crc kubenswrapper[4687]: I0228 09:58:57.009631 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j45kw/crc-debug-kjnk7" Feb 28 09:58:57 crc kubenswrapper[4687]: I0228 09:58:57.296959 4687 generic.go:334] "Generic (PLEG): container finished" podID="f93f0f31-842e-48a0-9f96-cab7607741c0" containerID="b60122b114fb37cc6ec79cf89aa05c7221e4a959530eb11980c1b58ae838c62a" exitCode=0 Feb 28 09:58:57 crc kubenswrapper[4687]: I0228 09:58:57.297031 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j45kw/crc-debug-kjnk7" event={"ID":"f93f0f31-842e-48a0-9f96-cab7607741c0","Type":"ContainerDied","Data":"b60122b114fb37cc6ec79cf89aa05c7221e4a959530eb11980c1b58ae838c62a"} Feb 28 09:58:57 crc kubenswrapper[4687]: I0228 09:58:57.297297 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j45kw/crc-debug-kjnk7" event={"ID":"f93f0f31-842e-48a0-9f96-cab7607741c0","Type":"ContainerStarted","Data":"2af055084bde52fb33e9421f221bbf21206ff3f3323dd6d627f8af8718a85319"} Feb 28 09:58:57 crc kubenswrapper[4687]: I0228 09:58:57.335339 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-must-gather-j45kw/crc-debug-kjnk7"] Feb 28 09:58:57 crc kubenswrapper[4687]: I0228 09:58:57.345729 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-j45kw/crc-debug-kjnk7"] Feb 28 09:58:58 crc kubenswrapper[4687]: I0228 09:58:58.390792 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j45kw/crc-debug-kjnk7" Feb 28 09:58:58 crc kubenswrapper[4687]: I0228 09:58:58.458213 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f93f0f31-842e-48a0-9f96-cab7607741c0-host\") pod \"f93f0f31-842e-48a0-9f96-cab7607741c0\" (UID: \"f93f0f31-842e-48a0-9f96-cab7607741c0\") " Feb 28 09:58:58 crc kubenswrapper[4687]: I0228 09:58:58.458327 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f93f0f31-842e-48a0-9f96-cab7607741c0-host" (OuterVolumeSpecName: "host") pod "f93f0f31-842e-48a0-9f96-cab7607741c0" (UID: "f93f0f31-842e-48a0-9f96-cab7607741c0"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 09:58:58 crc kubenswrapper[4687]: I0228 09:58:58.458644 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmk8p\" (UniqueName: \"kubernetes.io/projected/f93f0f31-842e-48a0-9f96-cab7607741c0-kube-api-access-vmk8p\") pod \"f93f0f31-842e-48a0-9f96-cab7607741c0\" (UID: \"f93f0f31-842e-48a0-9f96-cab7607741c0\") " Feb 28 09:58:58 crc kubenswrapper[4687]: I0228 09:58:58.459201 4687 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f93f0f31-842e-48a0-9f96-cab7607741c0-host\") on node \"crc\" DevicePath \"\"" Feb 28 09:58:58 crc kubenswrapper[4687]: I0228 09:58:58.464091 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f93f0f31-842e-48a0-9f96-cab7607741c0-kube-api-access-vmk8p" (OuterVolumeSpecName: "kube-api-access-vmk8p") pod "f93f0f31-842e-48a0-9f96-cab7607741c0" (UID: "f93f0f31-842e-48a0-9f96-cab7607741c0"). InnerVolumeSpecName "kube-api-access-vmk8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 09:58:58 crc kubenswrapper[4687]: I0228 09:58:58.560758 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmk8p\" (UniqueName: \"kubernetes.io/projected/f93f0f31-842e-48a0-9f96-cab7607741c0-kube-api-access-vmk8p\") on node \"crc\" DevicePath \"\"" Feb 28 09:58:58 crc kubenswrapper[4687]: I0228 09:58:58.667572 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f93f0f31-842e-48a0-9f96-cab7607741c0" path="/var/lib/kubelet/pods/f93f0f31-842e-48a0-9f96-cab7607741c0/volumes" Feb 28 09:58:59 crc kubenswrapper[4687]: I0228 09:58:59.316616 4687 scope.go:117] "RemoveContainer" containerID="b60122b114fb37cc6ec79cf89aa05c7221e4a959530eb11980c1b58ae838c62a" Feb 28 09:58:59 crc kubenswrapper[4687]: I0228 09:58:59.316663 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j45kw/crc-debug-kjnk7" Feb 28 09:59:25 crc kubenswrapper[4687]: I0228 09:59:25.004122 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:59:25 crc kubenswrapper[4687]: I0228 09:59:25.008197 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 09:59:27 crc kubenswrapper[4687]: I0228 09:59:27.104959 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6f95b8bb44-tjzcn_fa58d12c-eed3-46e2-915f-c8383b8949fe/barbican-api/0.log" Feb 28 09:59:27 crc kubenswrapper[4687]: I0228 09:59:27.230621 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6f95b8bb44-tjzcn_fa58d12c-eed3-46e2-915f-c8383b8949fe/barbican-api-log/0.log" Feb 28 09:59:27 crc kubenswrapper[4687]: I0228 09:59:27.286872 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6586f4f898-ssm26_cc722f81-31b0-44eb-8206-4256e2ae12f0/barbican-keystone-listener/0.log" Feb 28 09:59:27 crc kubenswrapper[4687]: I0228 09:59:27.398159 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6586f4f898-ssm26_cc722f81-31b0-44eb-8206-4256e2ae12f0/barbican-keystone-listener-log/0.log" Feb 28 09:59:27 crc kubenswrapper[4687]: I0228 09:59:27.449563 4687 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-5f58cc8c7c-dxx99_5ec85d56-f00e-4193-b4eb-ae0d43a13ffa/barbican-worker/0.log" Feb 28 09:59:27 crc kubenswrapper[4687]: I0228 09:59:27.519762 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5f58cc8c7c-dxx99_5ec85d56-f00e-4193-b4eb-ae0d43a13ffa/barbican-worker-log/0.log" Feb 28 09:59:27 crc kubenswrapper[4687]: I0228 09:59:27.689595 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-dkxls_e607377f-9f4c-4f40-8d5c-17487eb054b8/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:59:27 crc kubenswrapper[4687]: I0228 09:59:27.723885 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d031a035-5ae3-4544-9181-756dba921ef0/ceilometer-central-agent/0.log" Feb 28 09:59:27 crc kubenswrapper[4687]: I0228 09:59:27.789978 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d031a035-5ae3-4544-9181-756dba921ef0/ceilometer-notification-agent/0.log" Feb 28 09:59:27 crc kubenswrapper[4687]: I0228 09:59:27.893854 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d031a035-5ae3-4544-9181-756dba921ef0/sg-core/0.log" Feb 28 09:59:27 crc kubenswrapper[4687]: I0228 09:59:27.898501 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d031a035-5ae3-4544-9181-756dba921ef0/proxy-httpd/0.log" Feb 28 09:59:28 crc kubenswrapper[4687]: I0228 09:59:28.013423 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c7902e63-a118-4905-ad9d-3a4d15edce78/cinder-api/0.log" Feb 28 09:59:28 crc kubenswrapper[4687]: I0228 09:59:28.062451 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c7902e63-a118-4905-ad9d-3a4d15edce78/cinder-api-log/0.log" Feb 28 09:59:28 crc kubenswrapper[4687]: I0228 09:59:28.170494 4687 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_cinder-scheduler-0_0e9f0b9e-618d-409d-b76f-5da56783af17/cinder-scheduler/0.log" Feb 28 09:59:28 crc kubenswrapper[4687]: I0228 09:59:28.237914 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_0e9f0b9e-618d-409d-b76f-5da56783af17/probe/0.log" Feb 28 09:59:28 crc kubenswrapper[4687]: I0228 09:59:28.376334 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-lxpx4_f3bbc9b7-2863-45fb-a890-fba1253b1f63/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:59:28 crc kubenswrapper[4687]: I0228 09:59:28.423083 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-x7hs5_8a947fbc-4fb5-4be7-819c-703c45480b29/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:59:28 crc kubenswrapper[4687]: I0228 09:59:28.557637 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79fcc958f9-9dbr2_a8f2c1ae-1407-4d58-86af-05f1f1311d1a/init/0.log" Feb 28 09:59:28 crc kubenswrapper[4687]: I0228 09:59:28.758315 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79fcc958f9-9dbr2_a8f2c1ae-1407-4d58-86af-05f1f1311d1a/init/0.log" Feb 28 09:59:28 crc kubenswrapper[4687]: I0228 09:59:28.782993 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79fcc958f9-9dbr2_a8f2c1ae-1407-4d58-86af-05f1f1311d1a/dnsmasq-dns/0.log" Feb 28 09:59:28 crc kubenswrapper[4687]: I0228 09:59:28.838808 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-l6lxt_2162138d-1397-4721-adeb-73e30bf37580/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:59:28 crc kubenswrapper[4687]: I0228 09:59:28.972573 4687 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_df7927ff-9e46-45c4-8f30-f55742dda755/glance-log/0.log" Feb 28 09:59:28 crc kubenswrapper[4687]: I0228 09:59:28.976251 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_df7927ff-9e46-45c4-8f30-f55742dda755/glance-httpd/0.log" Feb 28 09:59:29 crc kubenswrapper[4687]: I0228 09:59:29.179533 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8ab8b6d1-f4d6-4206-94a9-14e1770f672a/glance-httpd/0.log" Feb 28 09:59:29 crc kubenswrapper[4687]: I0228 09:59:29.198249 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8ab8b6d1-f4d6-4206-94a9-14e1770f672a/glance-log/0.log" Feb 28 09:59:29 crc kubenswrapper[4687]: I0228 09:59:29.484153 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-b9587f844-jq5pd_113841cd-f813-4ee0-93cf-2e3cfb43f6fc/horizon/0.log" Feb 28 09:59:29 crc kubenswrapper[4687]: I0228 09:59:29.552229 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-zkr2k_d966dc9f-36d1-4236-8839-0f9794c0e663/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:59:29 crc kubenswrapper[4687]: I0228 09:59:29.738924 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-gdmsq_380b1201-b6ba-48e4-b282-fad4f9b945d7/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:59:29 crc kubenswrapper[4687]: I0228 09:59:29.841159 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-b9587f844-jq5pd_113841cd-f813-4ee0-93cf-2e3cfb43f6fc/horizon-log/0.log" Feb 28 09:59:30 crc kubenswrapper[4687]: I0228 09:59:30.024635 4687 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_42ce0499-adfd-41cd-9f90-db487bc7c7a0/kube-state-metrics/0.log" Feb 28 09:59:30 crc kubenswrapper[4687]: I0228 09:59:30.026700 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-8685d6f5dd-ndtlf_8fcd0fba-03d4-4584-b991-7f719e04b98d/keystone-api/0.log" Feb 28 09:59:30 crc kubenswrapper[4687]: I0228 09:59:30.182598 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-ldfg9_6fb2570c-4ba8-41f6-83a3-038b8ab54177/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:59:30 crc kubenswrapper[4687]: I0228 09:59:30.496261 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6bd86ccc79-8jlb2_2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a/neutron-httpd/0.log" Feb 28 09:59:30 crc kubenswrapper[4687]: I0228 09:59:30.545371 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6bd86ccc79-8jlb2_2ef0b9c9-91da-4254-9ebb-3f93ff2b2b3a/neutron-api/0.log" Feb 28 09:59:30 crc kubenswrapper[4687]: I0228 09:59:30.570961 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-d2sc8_29b1d03b-8788-4d8d-8105-700b9cfe905a/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:59:31 crc kubenswrapper[4687]: I0228 09:59:31.101323 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a36f861b-f068-4184-bca3-ef07c5d8cec5/nova-api-log/0.log" Feb 28 09:59:31 crc kubenswrapper[4687]: I0228 09:59:31.222203 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_7060db5b-32fc-481f-a4d6-520e585175b7/nova-cell0-conductor-conductor/0.log" Feb 28 09:59:31 crc kubenswrapper[4687]: I0228 09:59:31.427806 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a36f861b-f068-4184-bca3-ef07c5d8cec5/nova-api-api/0.log" Feb 28 
09:59:31 crc kubenswrapper[4687]: I0228 09:59:31.484230 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_e45dcf0c-b04a-4ae5-9488-2051b3ea91df/nova-cell1-conductor-conductor/0.log" Feb 28 09:59:31 crc kubenswrapper[4687]: I0228 09:59:31.581604 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_02b56b91-2ca9-4bea-b8d4-ad653daa91b8/nova-cell1-novncproxy-novncproxy/0.log" Feb 28 09:59:31 crc kubenswrapper[4687]: I0228 09:59:31.742151 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-dv48x_b0b65af5-abae-4587-abda-dfda34ed0d0b/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:59:31 crc kubenswrapper[4687]: I0228 09:59:31.892302 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8d4ccf04-08de-4138-ba4a-b8f5659a37fc/nova-metadata-log/0.log" Feb 28 09:59:32 crc kubenswrapper[4687]: I0228 09:59:32.232960 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d1fe0178-db8f-44e3-9e53-a2450914080a/mysql-bootstrap/0.log" Feb 28 09:59:32 crc kubenswrapper[4687]: I0228 09:59:32.284348 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_5586e3ed-9ec4-4c0f-9d31-57120488f2cd/nova-scheduler-scheduler/0.log" Feb 28 09:59:32 crc kubenswrapper[4687]: I0228 09:59:32.619309 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d1fe0178-db8f-44e3-9e53-a2450914080a/mysql-bootstrap/0.log" Feb 28 09:59:32 crc kubenswrapper[4687]: I0228 09:59:32.626842 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d1fe0178-db8f-44e3-9e53-a2450914080a/galera/0.log" Feb 28 09:59:32 crc kubenswrapper[4687]: I0228 09:59:32.793652 4687 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_c1fac181-ae33-45e1-8171-1d998d59bc04/mysql-bootstrap/0.log" Feb 28 09:59:32 crc kubenswrapper[4687]: I0228 09:59:32.904144 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8d4ccf04-08de-4138-ba4a-b8f5659a37fc/nova-metadata-metadata/0.log" Feb 28 09:59:32 crc kubenswrapper[4687]: I0228 09:59:32.968583 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c1fac181-ae33-45e1-8171-1d998d59bc04/mysql-bootstrap/0.log" Feb 28 09:59:33 crc kubenswrapper[4687]: I0228 09:59:33.002519 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c1fac181-ae33-45e1-8171-1d998d59bc04/galera/0.log" Feb 28 09:59:33 crc kubenswrapper[4687]: I0228 09:59:33.112311 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_4ae7ed9a-fc3d-4dd9-b599-751ff3d8bb39/openstackclient/0.log" Feb 28 09:59:33 crc kubenswrapper[4687]: I0228 09:59:33.248055 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-grkmn_b7837572-8dcc-409d-b8fd-c37f2af52474/ovn-controller/0.log" Feb 28 09:59:33 crc kubenswrapper[4687]: I0228 09:59:33.307126 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-csrrp_d4f7bb81-e353-405c-9676-8a57d0886dae/openstack-network-exporter/0.log" Feb 28 09:59:33 crc kubenswrapper[4687]: I0228 09:59:33.480443 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kbhr4_ce17423e-ccd3-4aad-9538-2424a822d5df/ovsdb-server-init/0.log" Feb 28 09:59:33 crc kubenswrapper[4687]: I0228 09:59:33.645103 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kbhr4_ce17423e-ccd3-4aad-9538-2424a822d5df/ovsdb-server-init/0.log" Feb 28 09:59:33 crc kubenswrapper[4687]: I0228 09:59:33.649337 4687 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-kbhr4_ce17423e-ccd3-4aad-9538-2424a822d5df/ovs-vswitchd/0.log" Feb 28 09:59:33 crc kubenswrapper[4687]: I0228 09:59:33.702381 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kbhr4_ce17423e-ccd3-4aad-9538-2424a822d5df/ovsdb-server/0.log" Feb 28 09:59:33 crc kubenswrapper[4687]: I0228 09:59:33.863243 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-hcsz4_c1151261-c776-4190-ad84-46a4a3c68a6a/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:59:33 crc kubenswrapper[4687]: I0228 09:59:33.920250 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_99f1dc2d-f77e-447b-836c-d485426a72c2/openstack-network-exporter/0.log" Feb 28 09:59:33 crc kubenswrapper[4687]: I0228 09:59:33.992263 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_99f1dc2d-f77e-447b-836c-d485426a72c2/ovn-northd/0.log" Feb 28 09:59:34 crc kubenswrapper[4687]: I0228 09:59:34.070035 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695/openstack-network-exporter/0.log" Feb 28 09:59:34 crc kubenswrapper[4687]: I0228 09:59:34.168811 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_f9ed6dc4-5a44-4cc0-9bc4-9f132aae1695/ovsdbserver-nb/0.log" Feb 28 09:59:34 crc kubenswrapper[4687]: I0228 09:59:34.258168 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_dcb66eab-811b-4162-a74b-2fc36e9e51b5/openstack-network-exporter/0.log" Feb 28 09:59:34 crc kubenswrapper[4687]: I0228 09:59:34.352364 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_dcb66eab-811b-4162-a74b-2fc36e9e51b5/ovsdbserver-sb/0.log" Feb 28 09:59:34 crc kubenswrapper[4687]: I0228 09:59:34.486389 4687 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_placement-d6696bd5b-vf747_0aa8b593-6c7b-438e-b95c-3f39081df0ea/placement-api/0.log" Feb 28 09:59:34 crc kubenswrapper[4687]: I0228 09:59:34.512313 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-d6696bd5b-vf747_0aa8b593-6c7b-438e-b95c-3f39081df0ea/placement-log/0.log" Feb 28 09:59:34 crc kubenswrapper[4687]: I0228 09:59:34.554534 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_02945b48-0d0e-4c7c-8247-7b3060a6fc3c/setup-container/0.log" Feb 28 09:59:34 crc kubenswrapper[4687]: I0228 09:59:34.753097 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_02945b48-0d0e-4c7c-8247-7b3060a6fc3c/setup-container/0.log" Feb 28 09:59:34 crc kubenswrapper[4687]: I0228 09:59:34.815842 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_02945b48-0d0e-4c7c-8247-7b3060a6fc3c/rabbitmq/0.log" Feb 28 09:59:34 crc kubenswrapper[4687]: I0228 09:59:34.846244 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0af13829-a7ca-4952-8e73-2923cc70ef98/setup-container/0.log" Feb 28 09:59:35 crc kubenswrapper[4687]: I0228 09:59:35.066398 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0af13829-a7ca-4952-8e73-2923cc70ef98/setup-container/0.log" Feb 28 09:59:35 crc kubenswrapper[4687]: I0228 09:59:35.083946 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-9ssnd_2bb3057f-10bb-43e9-af01-41131c5b6fb1/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:59:35 crc kubenswrapper[4687]: I0228 09:59:35.189620 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0af13829-a7ca-4952-8e73-2923cc70ef98/rabbitmq/0.log" Feb 28 09:59:35 crc kubenswrapper[4687]: I0228 09:59:35.286985 4687 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-lg8ps_5a7981ec-8e60-4379-af52-5188e5b53dcf/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:59:35 crc kubenswrapper[4687]: I0228 09:59:35.588305 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-x97sd_bb39766e-6294-4141-be47-7a7085460449/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:59:35 crc kubenswrapper[4687]: I0228 09:59:35.694588 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-bn6fw_47d00581-22fa-4c52-a057-6d757f969f52/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:59:35 crc kubenswrapper[4687]: I0228 09:59:35.799146 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-989zc_a605b600-b94d-4f23-9922-f9d8478cf6ef/ssh-known-hosts-edpm-deployment/0.log" Feb 28 09:59:35 crc kubenswrapper[4687]: I0228 09:59:35.980330 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-fdfb795c-sf6nb_10b30927-e15b-4464-b5e4-1245c90ce5f8/proxy-httpd/0.log" Feb 28 09:59:36 crc kubenswrapper[4687]: I0228 09:59:36.003217 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-fdfb795c-sf6nb_10b30927-e15b-4464-b5e4-1245c90ce5f8/proxy-server/0.log" Feb 28 09:59:36 crc kubenswrapper[4687]: I0228 09:59:36.109727 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-s57nv_6cf929c8-d005-4feb-8eb4-544e89507ad9/swift-ring-rebalance/0.log" Feb 28 09:59:36 crc kubenswrapper[4687]: I0228 09:59:36.240214 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/account-reaper/0.log" Feb 28 09:59:36 crc kubenswrapper[4687]: I0228 09:59:36.268666 4687 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/account-auditor/0.log" Feb 28 09:59:36 crc kubenswrapper[4687]: I0228 09:59:36.346521 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/account-replicator/0.log" Feb 28 09:59:36 crc kubenswrapper[4687]: I0228 09:59:36.436807 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/account-server/0.log" Feb 28 09:59:36 crc kubenswrapper[4687]: I0228 09:59:36.451242 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/container-auditor/0.log" Feb 28 09:59:36 crc kubenswrapper[4687]: I0228 09:59:36.473584 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/container-replicator/0.log" Feb 28 09:59:36 crc kubenswrapper[4687]: I0228 09:59:36.600822 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/container-server/0.log" Feb 28 09:59:36 crc kubenswrapper[4687]: I0228 09:59:36.621677 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/container-updater/0.log" Feb 28 09:59:36 crc kubenswrapper[4687]: I0228 09:59:36.653053 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/object-auditor/0.log" Feb 28 09:59:36 crc kubenswrapper[4687]: I0228 09:59:36.700171 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/object-expirer/0.log" Feb 28 09:59:36 crc kubenswrapper[4687]: I0228 09:59:36.791203 4687 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/object-replicator/0.log" Feb 28 09:59:36 crc kubenswrapper[4687]: I0228 09:59:36.860121 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/object-server/0.log" Feb 28 09:59:36 crc kubenswrapper[4687]: I0228 09:59:36.905698 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/object-updater/0.log" Feb 28 09:59:36 crc kubenswrapper[4687]: I0228 09:59:36.926938 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/rsync/0.log" Feb 28 09:59:37 crc kubenswrapper[4687]: I0228 09:59:37.020862 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53dddde-f595-46a9-9764-dce250c7f5b0/swift-recon-cron/0.log" Feb 28 09:59:37 crc kubenswrapper[4687]: I0228 09:59:37.153382 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-x9fgc_6f4d944c-dd63-414e-8886-5b38a982c01a/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:59:37 crc kubenswrapper[4687]: I0228 09:59:37.206842 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_e3d191c1-f8c8-455f-848c-a3d0a7caaf81/tempest-tests-tempest-tests-runner/0.log" Feb 28 09:59:37 crc kubenswrapper[4687]: I0228 09:59:37.311210 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_67a962d7-9b93-4db0-84cc-cd340793023d/test-operator-logs-container/0.log" Feb 28 09:59:37 crc kubenswrapper[4687]: I0228 09:59:37.439829 4687 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-zgthc_b83907ec-ac55-4f72-9265-e919fa57514a/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 28 09:59:47 crc kubenswrapper[4687]: I0228 09:59:47.053607 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_48796fdd-f9c8-473a-b17f-c6da6d0ba3a5/memcached/0.log" Feb 28 09:59:55 crc kubenswrapper[4687]: I0228 09:59:55.003054 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 09:59:55 crc kubenswrapper[4687]: I0228 09:59:55.003738 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.147138 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537880-5dqnb"] Feb 28 10:00:00 crc kubenswrapper[4687]: E0228 10:00:00.148148 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93f0f31-842e-48a0-9f96-cab7607741c0" containerName="container-00" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.148161 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93f0f31-842e-48a0-9f96-cab7607741c0" containerName="container-00" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.148369 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="f93f0f31-842e-48a0-9f96-cab7607741c0" containerName="container-00" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.149122 4687 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537880-5dqnb" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.153814 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.153954 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.153827 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.161630 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537880-9r8w2"] Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.163206 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537880-9r8w2" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.165524 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.165723 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.182374 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537880-9r8w2"] Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.195952 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537880-5dqnb"] Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.266236 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/af541c53-6204-416a-a104-e6f6b47e5a26-secret-volume\") pod \"collect-profiles-29537880-9r8w2\" (UID: \"af541c53-6204-416a-a104-e6f6b47e5a26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537880-9r8w2" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.266299 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bwlc\" (UniqueName: \"kubernetes.io/projected/b69e60d8-222f-4580-8723-856306017592-kube-api-access-8bwlc\") pod \"auto-csr-approver-29537880-5dqnb\" (UID: \"b69e60d8-222f-4580-8723-856306017592\") " pod="openshift-infra/auto-csr-approver-29537880-5dqnb" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.266381 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af541c53-6204-416a-a104-e6f6b47e5a26-config-volume\") pod \"collect-profiles-29537880-9r8w2\" (UID: \"af541c53-6204-416a-a104-e6f6b47e5a26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537880-9r8w2" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.266460 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m69wb\" (UniqueName: \"kubernetes.io/projected/af541c53-6204-416a-a104-e6f6b47e5a26-kube-api-access-m69wb\") pod \"collect-profiles-29537880-9r8w2\" (UID: \"af541c53-6204-416a-a104-e6f6b47e5a26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537880-9r8w2" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.368942 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af541c53-6204-416a-a104-e6f6b47e5a26-secret-volume\") pod \"collect-profiles-29537880-9r8w2\" (UID: \"af541c53-6204-416a-a104-e6f6b47e5a26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537880-9r8w2" Feb 28 
10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.385280 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bwlc\" (UniqueName: \"kubernetes.io/projected/b69e60d8-222f-4580-8723-856306017592-kube-api-access-8bwlc\") pod \"auto-csr-approver-29537880-5dqnb\" (UID: \"b69e60d8-222f-4580-8723-856306017592\") " pod="openshift-infra/auto-csr-approver-29537880-5dqnb" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.385474 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af541c53-6204-416a-a104-e6f6b47e5a26-config-volume\") pod \"collect-profiles-29537880-9r8w2\" (UID: \"af541c53-6204-416a-a104-e6f6b47e5a26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537880-9r8w2" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.385752 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m69wb\" (UniqueName: \"kubernetes.io/projected/af541c53-6204-416a-a104-e6f6b47e5a26-kube-api-access-m69wb\") pod \"collect-profiles-29537880-9r8w2\" (UID: \"af541c53-6204-416a-a104-e6f6b47e5a26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537880-9r8w2" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.386794 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af541c53-6204-416a-a104-e6f6b47e5a26-config-volume\") pod \"collect-profiles-29537880-9r8w2\" (UID: \"af541c53-6204-416a-a104-e6f6b47e5a26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537880-9r8w2" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.395750 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af541c53-6204-416a-a104-e6f6b47e5a26-secret-volume\") pod \"collect-profiles-29537880-9r8w2\" (UID: 
\"af541c53-6204-416a-a104-e6f6b47e5a26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537880-9r8w2" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.407885 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bwlc\" (UniqueName: \"kubernetes.io/projected/b69e60d8-222f-4580-8723-856306017592-kube-api-access-8bwlc\") pod \"auto-csr-approver-29537880-5dqnb\" (UID: \"b69e60d8-222f-4580-8723-856306017592\") " pod="openshift-infra/auto-csr-approver-29537880-5dqnb" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.409982 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m69wb\" (UniqueName: \"kubernetes.io/projected/af541c53-6204-416a-a104-e6f6b47e5a26-kube-api-access-m69wb\") pod \"collect-profiles-29537880-9r8w2\" (UID: \"af541c53-6204-416a-a104-e6f6b47e5a26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537880-9r8w2" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.471920 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537880-5dqnb" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.485551 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537880-9r8w2" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.637977 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf_cc095223-5798-4cc2-a762-ca92a629167c/util/0.log" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.886620 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf_cc095223-5798-4cc2-a762-ca92a629167c/pull/0.log" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.892483 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf_cc095223-5798-4cc2-a762-ca92a629167c/util/0.log" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.924142 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf_cc095223-5798-4cc2-a762-ca92a629167c/pull/0.log" Feb 28 10:00:00 crc kubenswrapper[4687]: I0228 10:00:00.949364 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537880-9r8w2"] Feb 28 10:00:01 crc kubenswrapper[4687]: I0228 10:00:01.029229 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537880-5dqnb"] Feb 28 10:00:01 crc kubenswrapper[4687]: W0228 10:00:01.042948 4687 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb69e60d8_222f_4580_8723_856306017592.slice/crio-420906f141ed4f290d8586f87c0f3cc72a41feefc0b3bd41b3e1e34222bed282 WatchSource:0}: Error finding container 420906f141ed4f290d8586f87c0f3cc72a41feefc0b3bd41b3e1e34222bed282: Status 404 returned error can't find the container with id 
420906f141ed4f290d8586f87c0f3cc72a41feefc0b3bd41b3e1e34222bed282 Feb 28 10:00:01 crc kubenswrapper[4687]: I0228 10:00:01.157830 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf_cc095223-5798-4cc2-a762-ca92a629167c/extract/0.log" Feb 28 10:00:01 crc kubenswrapper[4687]: I0228 10:00:01.164226 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf_cc095223-5798-4cc2-a762-ca92a629167c/util/0.log" Feb 28 10:00:01 crc kubenswrapper[4687]: I0228 10:00:01.211459 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c796774dnhf_cc095223-5798-4cc2-a762-ca92a629167c/pull/0.log" Feb 28 10:00:01 crc kubenswrapper[4687]: I0228 10:00:01.593590 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-5d87c9d997-7wrs7_c3d5a3fe-4e59-43c3-aef3-33c3e7830cb1/manager/0.log" Feb 28 10:00:01 crc kubenswrapper[4687]: I0228 10:00:01.866188 4687 generic.go:334] "Generic (PLEG): container finished" podID="af541c53-6204-416a-a104-e6f6b47e5a26" containerID="8d5c2655f003f2b0dd7981e8aff4f3d8008d6fe9966f5595178fd5d904e0396f" exitCode=0 Feb 28 10:00:01 crc kubenswrapper[4687]: I0228 10:00:01.866596 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537880-9r8w2" event={"ID":"af541c53-6204-416a-a104-e6f6b47e5a26","Type":"ContainerDied","Data":"8d5c2655f003f2b0dd7981e8aff4f3d8008d6fe9966f5595178fd5d904e0396f"} Feb 28 10:00:01 crc kubenswrapper[4687]: I0228 10:00:01.866628 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537880-9r8w2" 
event={"ID":"af541c53-6204-416a-a104-e6f6b47e5a26","Type":"ContainerStarted","Data":"af2989b91a55d5d1cb5d632748c0cdd2a1f44c90e2ea1722170368b9a17cda4e"} Feb 28 10:00:01 crc kubenswrapper[4687]: I0228 10:00:01.873389 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537880-5dqnb" event={"ID":"b69e60d8-222f-4580-8723-856306017592","Type":"ContainerStarted","Data":"420906f141ed4f290d8586f87c0f3cc72a41feefc0b3bd41b3e1e34222bed282"} Feb 28 10:00:01 crc kubenswrapper[4687]: I0228 10:00:01.911215 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-64db6967f8-9zkzk_30b87ec4-ee50-402d-8afc-a3f9241bbc4c/manager/0.log" Feb 28 10:00:02 crc kubenswrapper[4687]: I0228 10:00:02.043741 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-cf99c678f-ltpvl_5945c472-0f03-4666-84ca-b8f4545db411/manager/0.log" Feb 28 10:00:02 crc kubenswrapper[4687]: I0228 10:00:02.284417 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-78bc7f9bd9-v9vbd_0e2af601-594d-47f7-95ef-0474051dae27/manager/0.log" Feb 28 10:00:02 crc kubenswrapper[4687]: I0228 10:00:02.680682 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-545456dc4-9nm28_f5b51009-d199-4b88-9158-1b7b3b1848d3/manager/0.log" Feb 28 10:00:02 crc kubenswrapper[4687]: I0228 10:00:02.809329 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-chfpl_40ae4140-3768-425a-9791-234afb6297fe/manager/0.log" Feb 28 10:00:02 crc kubenswrapper[4687]: I0228 10:00:02.919576 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-f7fcc58b9-vqdm7_caa33de5-0fe2-4930-bf89-0f8ad6a96ca2/manager/0.log" Feb 28 10:00:03 crc 
kubenswrapper[4687]: I0228 10:00:03.153751 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-jw6hs_89b24774-f0eb-4d63-a124-1b244f195163/manager/0.log" Feb 28 10:00:03 crc kubenswrapper[4687]: I0228 10:00:03.163044 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7c789f89c6-8r8kv_14725449-2193-4b84-b736-31c04f9f43e4/manager/0.log" Feb 28 10:00:03 crc kubenswrapper[4687]: I0228 10:00:03.181533 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537880-9r8w2" Feb 28 10:00:03 crc kubenswrapper[4687]: I0228 10:00:03.356003 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af541c53-6204-416a-a104-e6f6b47e5a26-secret-volume\") pod \"af541c53-6204-416a-a104-e6f6b47e5a26\" (UID: \"af541c53-6204-416a-a104-e6f6b47e5a26\") " Feb 28 10:00:03 crc kubenswrapper[4687]: I0228 10:00:03.356433 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af541c53-6204-416a-a104-e6f6b47e5a26-config-volume\") pod \"af541c53-6204-416a-a104-e6f6b47e5a26\" (UID: \"af541c53-6204-416a-a104-e6f6b47e5a26\") " Feb 28 10:00:03 crc kubenswrapper[4687]: I0228 10:00:03.356530 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m69wb\" (UniqueName: \"kubernetes.io/projected/af541c53-6204-416a-a104-e6f6b47e5a26-kube-api-access-m69wb\") pod \"af541c53-6204-416a-a104-e6f6b47e5a26\" (UID: \"af541c53-6204-416a-a104-e6f6b47e5a26\") " Feb 28 10:00:03 crc kubenswrapper[4687]: I0228 10:00:03.357514 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af541c53-6204-416a-a104-e6f6b47e5a26-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "af541c53-6204-416a-a104-e6f6b47e5a26" (UID: "af541c53-6204-416a-a104-e6f6b47e5a26"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 10:00:03 crc kubenswrapper[4687]: I0228 10:00:03.362440 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af541c53-6204-416a-a104-e6f6b47e5a26-kube-api-access-m69wb" (OuterVolumeSpecName: "kube-api-access-m69wb") pod "af541c53-6204-416a-a104-e6f6b47e5a26" (UID: "af541c53-6204-416a-a104-e6f6b47e5a26"). InnerVolumeSpecName "kube-api-access-m69wb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 10:00:03 crc kubenswrapper[4687]: I0228 10:00:03.373356 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af541c53-6204-416a-a104-e6f6b47e5a26-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "af541c53-6204-416a-a104-e6f6b47e5a26" (UID: "af541c53-6204-416a-a104-e6f6b47e5a26"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 10:00:03 crc kubenswrapper[4687]: I0228 10:00:03.377523 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7b6bfb6475-jbzlm_a2ca8c5d-3391-4ae4-a451-8a14fe2352aa/manager/0.log" Feb 28 10:00:03 crc kubenswrapper[4687]: I0228 10:00:03.458693 4687 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af541c53-6204-416a-a104-e6f6b47e5a26-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 28 10:00:03 crc kubenswrapper[4687]: I0228 10:00:03.458726 4687 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af541c53-6204-416a-a104-e6f6b47e5a26-config-volume\") on node \"crc\" DevicePath \"\"" Feb 28 10:00:03 crc kubenswrapper[4687]: I0228 10:00:03.458737 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m69wb\" (UniqueName: \"kubernetes.io/projected/af541c53-6204-416a-a104-e6f6b47e5a26-kube-api-access-m69wb\") on node \"crc\" DevicePath \"\"" Feb 28 10:00:03 crc kubenswrapper[4687]: I0228 10:00:03.667227 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54688575f-hsvs9_09ff8e79-084a-4043-9061-c7007b041e86/manager/0.log" Feb 28 10:00:03 crc kubenswrapper[4687]: I0228 10:00:03.735907 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-74b6b5dc96-kdxq5_72be3389-d521-4742-9081-8bdc3aef0dc6/manager/0.log" Feb 28 10:00:03 crc kubenswrapper[4687]: I0228 10:00:03.894637 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537880-9r8w2" event={"ID":"af541c53-6204-416a-a104-e6f6b47e5a26","Type":"ContainerDied","Data":"af2989b91a55d5d1cb5d632748c0cdd2a1f44c90e2ea1722170368b9a17cda4e"} Feb 28 10:00:03 crc kubenswrapper[4687]: 
I0228 10:00:03.894681 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af2989b91a55d5d1cb5d632748c0cdd2a1f44c90e2ea1722170368b9a17cda4e" Feb 28 10:00:03 crc kubenswrapper[4687]: I0228 10:00:03.894754 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537880-9r8w2" Feb 28 10:00:03 crc kubenswrapper[4687]: I0228 10:00:03.949600 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5d86c7ddb7-dsfvj_134bd541-e4b0-4e84-b85d-a50c413d6cd2/manager/0.log" Feb 28 10:00:04 crc kubenswrapper[4687]: I0228 10:00:04.093125 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7b4cc4776925xf7_e1f23b9a-0cdb-4cc2-865d-49e56d8fdebe/manager/0.log" Feb 28 10:00:04 crc kubenswrapper[4687]: I0228 10:00:04.240572 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537835-9grsd"] Feb 28 10:00:04 crc kubenswrapper[4687]: I0228 10:00:04.250893 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537835-9grsd"] Feb 28 10:00:04 crc kubenswrapper[4687]: I0228 10:00:04.305166 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-595c94944c-4zqnh_fff03855-1690-4745-825d-919a9f9469ea/operator/0.log" Feb 28 10:00:04 crc kubenswrapper[4687]: I0228 10:00:04.595478 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-qtbgc_c15f16ef-addd-4cba-b2c3-69b4691fa2c7/registry-server/0.log" Feb 28 10:00:04 crc kubenswrapper[4687]: I0228 10:00:04.664953 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e3383d8-2679-40ea-97e5-fbd106b18c91" 
path="/var/lib/kubelet/pods/2e3383d8-2679-40ea-97e5-fbd106b18c91/volumes" Feb 28 10:00:04 crc kubenswrapper[4687]: I0228 10:00:04.769292 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-75684d597f-9fpjj_41e8cac0-417a-4c1d-a31c-0389bdebd0ba/manager/0.log" Feb 28 10:00:04 crc kubenswrapper[4687]: I0228 10:00:04.802606 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-648564c9fc-jht6f_7f019778-ba45-4e4a-a6d8-dd6d056aed3b/manager/0.log" Feb 28 10:00:04 crc kubenswrapper[4687]: I0228 10:00:04.908676 4687 generic.go:334] "Generic (PLEG): container finished" podID="b69e60d8-222f-4580-8723-856306017592" containerID="4b939cda621fec5facf8469a9ac3daf05681b281920f2a4ef97b06a1d370fcd0" exitCode=0 Feb 28 10:00:04 crc kubenswrapper[4687]: I0228 10:00:04.908723 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537880-5dqnb" event={"ID":"b69e60d8-222f-4580-8723-856306017592","Type":"ContainerDied","Data":"4b939cda621fec5facf8469a9ac3daf05681b281920f2a4ef97b06a1d370fcd0"} Feb 28 10:00:05 crc kubenswrapper[4687]: I0228 10:00:05.047798 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-p64nn_da7dfebc-ad65-4d02-a7f8-c10f9a6ac0d4/operator/0.log" Feb 28 10:00:05 crc kubenswrapper[4687]: I0228 10:00:05.182202 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9b9ff9f4d-q5zdg_9f7d6d86-afe8-4c99-8e5e-d81279cf5a9a/manager/0.log" Feb 28 10:00:05 crc kubenswrapper[4687]: I0228 10:00:05.326893 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5fdb694969-fxqv8_5ab4ce15-ddc0-4f3b-bdb0-29ce65884eaf/manager/0.log" Feb 28 10:00:05 crc kubenswrapper[4687]: I0228 10:00:05.436819 4687 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55b5ff4dbb-2t7hs_ccb38bca-46b2-4c3c-a6c5-d30af68435d1/manager/0.log" Feb 28 10:00:05 crc kubenswrapper[4687]: I0228 10:00:05.663263 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-c92d5_3ebd35dc-7a29-4c3f-b442-bfe29d833f06/manager/0.log" Feb 28 10:00:05 crc kubenswrapper[4687]: I0228 10:00:05.907554 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-864b865b94-72kg5_005ef854-8015-4724-b7b1-42f8fe9a1497/manager/0.log" Feb 28 10:00:06 crc kubenswrapper[4687]: I0228 10:00:06.247954 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537880-5dqnb" Feb 28 10:00:06 crc kubenswrapper[4687]: I0228 10:00:06.415974 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bwlc\" (UniqueName: \"kubernetes.io/projected/b69e60d8-222f-4580-8723-856306017592-kube-api-access-8bwlc\") pod \"b69e60d8-222f-4580-8723-856306017592\" (UID: \"b69e60d8-222f-4580-8723-856306017592\") " Feb 28 10:00:06 crc kubenswrapper[4687]: I0228 10:00:06.428182 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b69e60d8-222f-4580-8723-856306017592-kube-api-access-8bwlc" (OuterVolumeSpecName: "kube-api-access-8bwlc") pod "b69e60d8-222f-4580-8723-856306017592" (UID: "b69e60d8-222f-4580-8723-856306017592"). InnerVolumeSpecName "kube-api-access-8bwlc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 10:00:06 crc kubenswrapper[4687]: I0228 10:00:06.519246 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bwlc\" (UniqueName: \"kubernetes.io/projected/b69e60d8-222f-4580-8723-856306017592-kube-api-access-8bwlc\") on node \"crc\" DevicePath \"\"" Feb 28 10:00:06 crc kubenswrapper[4687]: I0228 10:00:06.933359 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537880-5dqnb" event={"ID":"b69e60d8-222f-4580-8723-856306017592","Type":"ContainerDied","Data":"420906f141ed4f290d8586f87c0f3cc72a41feefc0b3bd41b3e1e34222bed282"} Feb 28 10:00:06 crc kubenswrapper[4687]: I0228 10:00:06.933658 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="420906f141ed4f290d8586f87c0f3cc72a41feefc0b3bd41b3e1e34222bed282" Feb 28 10:00:06 crc kubenswrapper[4687]: I0228 10:00:06.933723 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537880-5dqnb" Feb 28 10:00:07 crc kubenswrapper[4687]: I0228 10:00:07.312923 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537874-8rbwr"] Feb 28 10:00:07 crc kubenswrapper[4687]: I0228 10:00:07.325957 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537874-8rbwr"] Feb 28 10:00:07 crc kubenswrapper[4687]: I0228 10:00:07.540337 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6db6876945-jtdtt_dc30956e-12c6-4973-a99f-ae4b502abb17/manager/0.log" Feb 28 10:00:08 crc kubenswrapper[4687]: I0228 10:00:08.665710 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe98ecf8-2fd8-4f24-a3b7-6fc0e691a26b" path="/var/lib/kubelet/pods/fe98ecf8-2fd8-4f24-a3b7-6fc0e691a26b/volumes" Feb 28 10:00:22 crc kubenswrapper[4687]: I0228 10:00:22.627683 4687 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-494jw_7461d892-4781-495c-b78f-5fe375ed4f44/control-plane-machine-set-operator/0.log" Feb 28 10:00:22 crc kubenswrapper[4687]: I0228 10:00:22.773709 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9thbt_9292d86c-b9c1-4a63-a766-c25874ffa2f5/kube-rbac-proxy/0.log" Feb 28 10:00:22 crc kubenswrapper[4687]: I0228 10:00:22.805657 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9thbt_9292d86c-b9c1-4a63-a766-c25874ffa2f5/machine-api-operator/0.log" Feb 28 10:00:25 crc kubenswrapper[4687]: I0228 10:00:25.002757 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 10:00:25 crc kubenswrapper[4687]: I0228 10:00:25.003373 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 10:00:25 crc kubenswrapper[4687]: I0228 10:00:25.003427 4687 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" Feb 28 10:00:25 crc kubenswrapper[4687]: I0228 10:00:25.004312 4687 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7aa4c93cc379009cd173d6be3669f0744c441bb2f0f3fe73758c25336f7de5a1"} 
pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 28 10:00:25 crc kubenswrapper[4687]: I0228 10:00:25.004361 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" containerID="cri-o://7aa4c93cc379009cd173d6be3669f0744c441bb2f0f3fe73758c25336f7de5a1" gracePeriod=600 Feb 28 10:00:26 crc kubenswrapper[4687]: I0228 10:00:26.117535 4687 generic.go:334] "Generic (PLEG): container finished" podID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerID="7aa4c93cc379009cd173d6be3669f0744c441bb2f0f3fe73758c25336f7de5a1" exitCode=0 Feb 28 10:00:26 crc kubenswrapper[4687]: I0228 10:00:26.117628 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerDied","Data":"7aa4c93cc379009cd173d6be3669f0744c441bb2f0f3fe73758c25336f7de5a1"} Feb 28 10:00:26 crc kubenswrapper[4687]: I0228 10:00:26.118134 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerStarted","Data":"5d9028a98f26994b531f99d84668faf12d778306d75d6630b5778ce546d19200"} Feb 28 10:00:26 crc kubenswrapper[4687]: I0228 10:00:26.118194 4687 scope.go:117] "RemoveContainer" containerID="09bebc0f5946daa8db36d82105561afe9655d1a2881b438927e068724427e287" Feb 28 10:00:33 crc kubenswrapper[4687]: I0228 10:00:33.919985 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-jrbfz_5b4222a9-1f7a-48de-879a-4c5dc9d4d99d/cert-manager-controller/0.log" Feb 28 10:00:34 crc kubenswrapper[4687]: I0228 10:00:34.069805 4687 log.go:25] "Finished parsing log 
file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-gg5lw_45a33d4f-01db-48af-aa18-b0a18834a9ab/cert-manager-cainjector/0.log" Feb 28 10:00:34 crc kubenswrapper[4687]: I0228 10:00:34.122730 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-h6sww_42c2a835-9620-4ed3-8dc5-dbe24b201af7/cert-manager-webhook/0.log" Feb 28 10:00:44 crc kubenswrapper[4687]: I0228 10:00:44.494424 4687 scope.go:117] "RemoveContainer" containerID="447e228198642fa20aab6fcd3d719ed69156fd00567e259cc4619d3b355f03a5" Feb 28 10:00:44 crc kubenswrapper[4687]: I0228 10:00:44.523890 4687 scope.go:117] "RemoveContainer" containerID="8cb5c0aa8f2a9e0a1635a3d0c94db6e328ee6289e3a40686ce496591f7e3e79e" Feb 28 10:00:45 crc kubenswrapper[4687]: I0228 10:00:45.075042 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-pq26r_fe724a47-e6db-4940-885f-318abb45fb46/nmstate-console-plugin/0.log" Feb 28 10:00:45 crc kubenswrapper[4687]: I0228 10:00:45.249598 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-6lptp_02027bc3-0840-49ed-afe6-13d5285bdff9/nmstate-handler/0.log" Feb 28 10:00:45 crc kubenswrapper[4687]: I0228 10:00:45.291871 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-c66wz_88c47658-dd20-4f97-b063-b95f5bd2d79d/kube-rbac-proxy/0.log" Feb 28 10:00:45 crc kubenswrapper[4687]: I0228 10:00:45.392783 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-c66wz_88c47658-dd20-4f97-b063-b95f5bd2d79d/nmstate-metrics/0.log" Feb 28 10:00:45 crc kubenswrapper[4687]: I0228 10:00:45.435741 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-5sb42_8da54ae4-877e-4e38-890c-8eabef7c7033/nmstate-operator/0.log" Feb 28 10:00:45 crc kubenswrapper[4687]: I0228 
10:00:45.581007 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-8kg5p_5dc19058-cbce-4742-9c1f-11005a9aefbf/nmstate-webhook/0.log" Feb 28 10:01:00 crc kubenswrapper[4687]: I0228 10:01:00.147348 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29537881-hxvz8"] Feb 28 10:01:00 crc kubenswrapper[4687]: E0228 10:01:00.148378 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b69e60d8-222f-4580-8723-856306017592" containerName="oc" Feb 28 10:01:00 crc kubenswrapper[4687]: I0228 10:01:00.148393 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="b69e60d8-222f-4580-8723-856306017592" containerName="oc" Feb 28 10:01:00 crc kubenswrapper[4687]: E0228 10:01:00.148420 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af541c53-6204-416a-a104-e6f6b47e5a26" containerName="collect-profiles" Feb 28 10:01:00 crc kubenswrapper[4687]: I0228 10:01:00.148426 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="af541c53-6204-416a-a104-e6f6b47e5a26" containerName="collect-profiles" Feb 28 10:01:00 crc kubenswrapper[4687]: I0228 10:01:00.148626 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="af541c53-6204-416a-a104-e6f6b47e5a26" containerName="collect-profiles" Feb 28 10:01:00 crc kubenswrapper[4687]: I0228 10:01:00.148635 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="b69e60d8-222f-4580-8723-856306017592" containerName="oc" Feb 28 10:01:00 crc kubenswrapper[4687]: I0228 10:01:00.149364 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29537881-hxvz8" Feb 28 10:01:00 crc kubenswrapper[4687]: I0228 10:01:00.158733 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29537881-hxvz8"] Feb 28 10:01:00 crc kubenswrapper[4687]: I0228 10:01:00.208171 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf28a688-f2d4-4283-8828-f239a03d9029-fernet-keys\") pod \"keystone-cron-29537881-hxvz8\" (UID: \"cf28a688-f2d4-4283-8828-f239a03d9029\") " pod="openstack/keystone-cron-29537881-hxvz8" Feb 28 10:01:00 crc kubenswrapper[4687]: I0228 10:01:00.208272 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf28a688-f2d4-4283-8828-f239a03d9029-config-data\") pod \"keystone-cron-29537881-hxvz8\" (UID: \"cf28a688-f2d4-4283-8828-f239a03d9029\") " pod="openstack/keystone-cron-29537881-hxvz8" Feb 28 10:01:00 crc kubenswrapper[4687]: I0228 10:01:00.208627 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k4fk\" (UniqueName: \"kubernetes.io/projected/cf28a688-f2d4-4283-8828-f239a03d9029-kube-api-access-7k4fk\") pod \"keystone-cron-29537881-hxvz8\" (UID: \"cf28a688-f2d4-4283-8828-f239a03d9029\") " pod="openstack/keystone-cron-29537881-hxvz8" Feb 28 10:01:00 crc kubenswrapper[4687]: I0228 10:01:00.208796 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf28a688-f2d4-4283-8828-f239a03d9029-combined-ca-bundle\") pod \"keystone-cron-29537881-hxvz8\" (UID: \"cf28a688-f2d4-4283-8828-f239a03d9029\") " pod="openstack/keystone-cron-29537881-hxvz8" Feb 28 10:01:00 crc kubenswrapper[4687]: I0228 10:01:00.311149 4687 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-7k4fk\" (UniqueName: \"kubernetes.io/projected/cf28a688-f2d4-4283-8828-f239a03d9029-kube-api-access-7k4fk\") pod \"keystone-cron-29537881-hxvz8\" (UID: \"cf28a688-f2d4-4283-8828-f239a03d9029\") " pod="openstack/keystone-cron-29537881-hxvz8" Feb 28 10:01:00 crc kubenswrapper[4687]: I0228 10:01:00.311239 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf28a688-f2d4-4283-8828-f239a03d9029-combined-ca-bundle\") pod \"keystone-cron-29537881-hxvz8\" (UID: \"cf28a688-f2d4-4283-8828-f239a03d9029\") " pod="openstack/keystone-cron-29537881-hxvz8" Feb 28 10:01:00 crc kubenswrapper[4687]: I0228 10:01:00.311293 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf28a688-f2d4-4283-8828-f239a03d9029-fernet-keys\") pod \"keystone-cron-29537881-hxvz8\" (UID: \"cf28a688-f2d4-4283-8828-f239a03d9029\") " pod="openstack/keystone-cron-29537881-hxvz8" Feb 28 10:01:00 crc kubenswrapper[4687]: I0228 10:01:00.311347 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf28a688-f2d4-4283-8828-f239a03d9029-config-data\") pod \"keystone-cron-29537881-hxvz8\" (UID: \"cf28a688-f2d4-4283-8828-f239a03d9029\") " pod="openstack/keystone-cron-29537881-hxvz8" Feb 28 10:01:00 crc kubenswrapper[4687]: I0228 10:01:00.320415 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf28a688-f2d4-4283-8828-f239a03d9029-combined-ca-bundle\") pod \"keystone-cron-29537881-hxvz8\" (UID: \"cf28a688-f2d4-4283-8828-f239a03d9029\") " pod="openstack/keystone-cron-29537881-hxvz8" Feb 28 10:01:00 crc kubenswrapper[4687]: I0228 10:01:00.320514 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/cf28a688-f2d4-4283-8828-f239a03d9029-fernet-keys\") pod \"keystone-cron-29537881-hxvz8\" (UID: \"cf28a688-f2d4-4283-8828-f239a03d9029\") " pod="openstack/keystone-cron-29537881-hxvz8" Feb 28 10:01:00 crc kubenswrapper[4687]: I0228 10:01:00.320675 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf28a688-f2d4-4283-8828-f239a03d9029-config-data\") pod \"keystone-cron-29537881-hxvz8\" (UID: \"cf28a688-f2d4-4283-8828-f239a03d9029\") " pod="openstack/keystone-cron-29537881-hxvz8" Feb 28 10:01:00 crc kubenswrapper[4687]: I0228 10:01:00.327806 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k4fk\" (UniqueName: \"kubernetes.io/projected/cf28a688-f2d4-4283-8828-f239a03d9029-kube-api-access-7k4fk\") pod \"keystone-cron-29537881-hxvz8\" (UID: \"cf28a688-f2d4-4283-8828-f239a03d9029\") " pod="openstack/keystone-cron-29537881-hxvz8" Feb 28 10:01:00 crc kubenswrapper[4687]: I0228 10:01:00.470421 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29537881-hxvz8" Feb 28 10:01:00 crc kubenswrapper[4687]: I0228 10:01:00.874829 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29537881-hxvz8"] Feb 28 10:01:01 crc kubenswrapper[4687]: I0228 10:01:01.453163 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29537881-hxvz8" event={"ID":"cf28a688-f2d4-4283-8828-f239a03d9029","Type":"ContainerStarted","Data":"a62a5e66e7b4963ac1e3a53bf75f14945f660b19bdc9655c205278c5333f0a2e"} Feb 28 10:01:01 crc kubenswrapper[4687]: I0228 10:01:01.453503 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29537881-hxvz8" event={"ID":"cf28a688-f2d4-4283-8828-f239a03d9029","Type":"ContainerStarted","Data":"b079abc3389c350260e17d88c782c1efbca97d57aa3ce6018f9487d4611a1765"} Feb 28 10:01:01 crc kubenswrapper[4687]: I0228 10:01:01.474521 4687 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29537881-hxvz8" podStartSLOduration=1.474497717 podStartE2EDuration="1.474497717s" podCreationTimestamp="2026-02-28 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 10:01:01.464985884 +0000 UTC m=+3453.155555222" watchObservedRunningTime="2026-02-28 10:01:01.474497717 +0000 UTC m=+3453.165067054" Feb 28 10:01:03 crc kubenswrapper[4687]: I0228 10:01:03.475084 4687 generic.go:334] "Generic (PLEG): container finished" podID="cf28a688-f2d4-4283-8828-f239a03d9029" containerID="a62a5e66e7b4963ac1e3a53bf75f14945f660b19bdc9655c205278c5333f0a2e" exitCode=0 Feb 28 10:01:03 crc kubenswrapper[4687]: I0228 10:01:03.475171 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29537881-hxvz8" 
event={"ID":"cf28a688-f2d4-4283-8828-f239a03d9029","Type":"ContainerDied","Data":"a62a5e66e7b4963ac1e3a53bf75f14945f660b19bdc9655c205278c5333f0a2e"} Feb 28 10:01:04 crc kubenswrapper[4687]: I0228 10:01:04.772357 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29537881-hxvz8" Feb 28 10:01:04 crc kubenswrapper[4687]: I0228 10:01:04.900469 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf28a688-f2d4-4283-8828-f239a03d9029-combined-ca-bundle\") pod \"cf28a688-f2d4-4283-8828-f239a03d9029\" (UID: \"cf28a688-f2d4-4283-8828-f239a03d9029\") " Feb 28 10:01:04 crc kubenswrapper[4687]: I0228 10:01:04.901226 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k4fk\" (UniqueName: \"kubernetes.io/projected/cf28a688-f2d4-4283-8828-f239a03d9029-kube-api-access-7k4fk\") pod \"cf28a688-f2d4-4283-8828-f239a03d9029\" (UID: \"cf28a688-f2d4-4283-8828-f239a03d9029\") " Feb 28 10:01:04 crc kubenswrapper[4687]: I0228 10:01:04.901273 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf28a688-f2d4-4283-8828-f239a03d9029-config-data\") pod \"cf28a688-f2d4-4283-8828-f239a03d9029\" (UID: \"cf28a688-f2d4-4283-8828-f239a03d9029\") " Feb 28 10:01:04 crc kubenswrapper[4687]: I0228 10:01:04.901382 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf28a688-f2d4-4283-8828-f239a03d9029-fernet-keys\") pod \"cf28a688-f2d4-4283-8828-f239a03d9029\" (UID: \"cf28a688-f2d4-4283-8828-f239a03d9029\") " Feb 28 10:01:04 crc kubenswrapper[4687]: I0228 10:01:04.908736 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf28a688-f2d4-4283-8828-f239a03d9029-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "cf28a688-f2d4-4283-8828-f239a03d9029" (UID: "cf28a688-f2d4-4283-8828-f239a03d9029"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 10:01:04 crc kubenswrapper[4687]: I0228 10:01:04.908868 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf28a688-f2d4-4283-8828-f239a03d9029-kube-api-access-7k4fk" (OuterVolumeSpecName: "kube-api-access-7k4fk") pod "cf28a688-f2d4-4283-8828-f239a03d9029" (UID: "cf28a688-f2d4-4283-8828-f239a03d9029"). InnerVolumeSpecName "kube-api-access-7k4fk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 10:01:04 crc kubenswrapper[4687]: I0228 10:01:04.929109 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf28a688-f2d4-4283-8828-f239a03d9029-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf28a688-f2d4-4283-8828-f239a03d9029" (UID: "cf28a688-f2d4-4283-8828-f239a03d9029"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 10:01:04 crc kubenswrapper[4687]: I0228 10:01:04.963645 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf28a688-f2d4-4283-8828-f239a03d9029-config-data" (OuterVolumeSpecName: "config-data") pod "cf28a688-f2d4-4283-8828-f239a03d9029" (UID: "cf28a688-f2d4-4283-8828-f239a03d9029"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 10:01:05 crc kubenswrapper[4687]: I0228 10:01:05.006097 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k4fk\" (UniqueName: \"kubernetes.io/projected/cf28a688-f2d4-4283-8828-f239a03d9029-kube-api-access-7k4fk\") on node \"crc\" DevicePath \"\"" Feb 28 10:01:05 crc kubenswrapper[4687]: I0228 10:01:05.006140 4687 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf28a688-f2d4-4283-8828-f239a03d9029-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 10:01:05 crc kubenswrapper[4687]: I0228 10:01:05.006154 4687 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf28a688-f2d4-4283-8828-f239a03d9029-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 28 10:01:05 crc kubenswrapper[4687]: I0228 10:01:05.006165 4687 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf28a688-f2d4-4283-8828-f239a03d9029-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 10:01:05 crc kubenswrapper[4687]: I0228 10:01:05.494003 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29537881-hxvz8" event={"ID":"cf28a688-f2d4-4283-8828-f239a03d9029","Type":"ContainerDied","Data":"b079abc3389c350260e17d88c782c1efbca97d57aa3ce6018f9487d4611a1765"} Feb 28 10:01:05 crc kubenswrapper[4687]: I0228 10:01:05.494057 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29537881-hxvz8" Feb 28 10:01:05 crc kubenswrapper[4687]: I0228 10:01:05.494063 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b079abc3389c350260e17d88c782c1efbca97d57aa3ce6018f9487d4611a1765" Feb 28 10:01:09 crc kubenswrapper[4687]: I0228 10:01:09.265944 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-tqhsm_87d609a5-fd9a-4473-80e4-b94dc583b438/kube-rbac-proxy/0.log" Feb 28 10:01:09 crc kubenswrapper[4687]: I0228 10:01:09.313533 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-tqhsm_87d609a5-fd9a-4473-80e4-b94dc583b438/controller/0.log" Feb 28 10:01:09 crc kubenswrapper[4687]: I0228 10:01:09.459862 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/cp-frr-files/0.log" Feb 28 10:01:09 crc kubenswrapper[4687]: I0228 10:01:09.602423 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/cp-metrics/0.log" Feb 28 10:01:09 crc kubenswrapper[4687]: I0228 10:01:09.610073 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/cp-frr-files/0.log" Feb 28 10:01:09 crc kubenswrapper[4687]: I0228 10:01:09.616318 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/cp-reloader/0.log" Feb 28 10:01:09 crc kubenswrapper[4687]: I0228 10:01:09.649505 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/cp-reloader/0.log" Feb 28 10:01:09 crc kubenswrapper[4687]: I0228 10:01:09.749594 4687 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/cp-frr-files/0.log" Feb 28 10:01:09 crc kubenswrapper[4687]: I0228 10:01:09.799783 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/cp-metrics/0.log" Feb 28 10:01:09 crc kubenswrapper[4687]: I0228 10:01:09.813683 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/cp-reloader/0.log" Feb 28 10:01:09 crc kubenswrapper[4687]: I0228 10:01:09.826628 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/cp-metrics/0.log" Feb 28 10:01:10 crc kubenswrapper[4687]: I0228 10:01:10.031087 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/cp-reloader/0.log" Feb 28 10:01:10 crc kubenswrapper[4687]: I0228 10:01:10.042592 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/cp-metrics/0.log" Feb 28 10:01:10 crc kubenswrapper[4687]: I0228 10:01:10.069276 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/cp-frr-files/0.log" Feb 28 10:01:10 crc kubenswrapper[4687]: I0228 10:01:10.074842 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/controller/0.log" Feb 28 10:01:10 crc kubenswrapper[4687]: I0228 10:01:10.216217 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/frr-metrics/0.log" Feb 28 10:01:10 crc kubenswrapper[4687]: I0228 10:01:10.234886 4687 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/kube-rbac-proxy/0.log" Feb 28 10:01:10 crc kubenswrapper[4687]: I0228 10:01:10.250336 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/kube-rbac-proxy-frr/0.log" Feb 28 10:01:10 crc kubenswrapper[4687]: I0228 10:01:10.468512 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/reloader/0.log" Feb 28 10:01:10 crc kubenswrapper[4687]: I0228 10:01:10.491168 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-qxhmg_5df08eed-eb11-482c-95aa-daebcccec8a8/frr-k8s-webhook-server/0.log" Feb 28 10:01:10 crc kubenswrapper[4687]: I0228 10:01:10.727045 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6f7cb57fd8-p9bs4_370a0b00-a4b2-428b-887b-5e0a7dce8d53/manager/0.log" Feb 28 10:01:10 crc kubenswrapper[4687]: I0228 10:01:10.905291 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bnlzc_bdaf3bdc-4287-4a7c-9156-613b50d6afcc/kube-rbac-proxy/0.log" Feb 28 10:01:10 crc kubenswrapper[4687]: I0228 10:01:10.926043 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-686bcc794c-5fsqb_287a3bc3-7f28-47be-90ab-6b25ea27db38/webhook-server/0.log" Feb 28 10:01:11 crc kubenswrapper[4687]: I0228 10:01:11.505097 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bnlzc_bdaf3bdc-4287-4a7c-9156-613b50d6afcc/speaker/0.log" Feb 28 10:01:11 crc kubenswrapper[4687]: I0228 10:01:11.767621 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lnfnj_abbd4948-5005-4b4b-b0eb-de72a0b28860/frr/0.log" Feb 28 10:01:23 crc kubenswrapper[4687]: I0228 10:01:23.175543 4687 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd_d5bd06a9-5b96-437f-a148-91f7d90e1f00/util/0.log" Feb 28 10:01:23 crc kubenswrapper[4687]: I0228 10:01:23.373941 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd_d5bd06a9-5b96-437f-a148-91f7d90e1f00/util/0.log" Feb 28 10:01:23 crc kubenswrapper[4687]: I0228 10:01:23.406895 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd_d5bd06a9-5b96-437f-a148-91f7d90e1f00/pull/0.log" Feb 28 10:01:23 crc kubenswrapper[4687]: I0228 10:01:23.434648 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd_d5bd06a9-5b96-437f-a148-91f7d90e1f00/pull/0.log" Feb 28 10:01:23 crc kubenswrapper[4687]: I0228 10:01:23.591517 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd_d5bd06a9-5b96-437f-a148-91f7d90e1f00/util/0.log" Feb 28 10:01:23 crc kubenswrapper[4687]: I0228 10:01:23.605453 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd_d5bd06a9-5b96-437f-a148-91f7d90e1f00/pull/0.log" Feb 28 10:01:23 crc kubenswrapper[4687]: I0228 10:01:23.610194 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82sqxhd_d5bd06a9-5b96-437f-a148-91f7d90e1f00/extract/0.log" Feb 28 10:01:23 crc kubenswrapper[4687]: I0228 10:01:23.772141 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2ljbw_5d6d3c50-a212-411b-9c51-4ea3b3fee060/extract-utilities/0.log" Feb 
28 10:01:23 crc kubenswrapper[4687]: I0228 10:01:23.902977 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2ljbw_5d6d3c50-a212-411b-9c51-4ea3b3fee060/extract-utilities/0.log" Feb 28 10:01:23 crc kubenswrapper[4687]: I0228 10:01:23.931600 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2ljbw_5d6d3c50-a212-411b-9c51-4ea3b3fee060/extract-content/0.log" Feb 28 10:01:23 crc kubenswrapper[4687]: I0228 10:01:23.973772 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2ljbw_5d6d3c50-a212-411b-9c51-4ea3b3fee060/extract-content/0.log" Feb 28 10:01:24 crc kubenswrapper[4687]: I0228 10:01:24.092223 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2ljbw_5d6d3c50-a212-411b-9c51-4ea3b3fee060/extract-content/0.log" Feb 28 10:01:24 crc kubenswrapper[4687]: I0228 10:01:24.132466 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2ljbw_5d6d3c50-a212-411b-9c51-4ea3b3fee060/extract-utilities/0.log" Feb 28 10:01:24 crc kubenswrapper[4687]: I0228 10:01:24.342705 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99h7q_43862b0c-fb60-45f2-b4bd-0e09864292a9/extract-utilities/0.log" Feb 28 10:01:24 crc kubenswrapper[4687]: I0228 10:01:24.531147 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99h7q_43862b0c-fb60-45f2-b4bd-0e09864292a9/extract-content/0.log" Feb 28 10:01:24 crc kubenswrapper[4687]: I0228 10:01:24.574628 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99h7q_43862b0c-fb60-45f2-b4bd-0e09864292a9/extract-utilities/0.log" Feb 28 10:01:24 crc kubenswrapper[4687]: I0228 10:01:24.582534 4687 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-99h7q_43862b0c-fb60-45f2-b4bd-0e09864292a9/extract-content/0.log" Feb 28 10:01:24 crc kubenswrapper[4687]: I0228 10:01:24.729531 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2ljbw_5d6d3c50-a212-411b-9c51-4ea3b3fee060/registry-server/0.log" Feb 28 10:01:24 crc kubenswrapper[4687]: I0228 10:01:24.765271 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99h7q_43862b0c-fb60-45f2-b4bd-0e09864292a9/extract-utilities/0.log" Feb 28 10:01:24 crc kubenswrapper[4687]: I0228 10:01:24.813165 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99h7q_43862b0c-fb60-45f2-b4bd-0e09864292a9/extract-content/0.log" Feb 28 10:01:24 crc kubenswrapper[4687]: I0228 10:01:24.986568 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7_993f721e-f5f5-4e7e-9896-5931bd6e0023/util/0.log" Feb 28 10:01:25 crc kubenswrapper[4687]: I0228 10:01:25.193932 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7_993f721e-f5f5-4e7e-9896-5931bd6e0023/util/0.log" Feb 28 10:01:25 crc kubenswrapper[4687]: I0228 10:01:25.250403 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7_993f721e-f5f5-4e7e-9896-5931bd6e0023/pull/0.log" Feb 28 10:01:25 crc kubenswrapper[4687]: I0228 10:01:25.261125 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7_993f721e-f5f5-4e7e-9896-5931bd6e0023/pull/0.log" Feb 28 10:01:25 crc kubenswrapper[4687]: I0228 10:01:25.397872 4687 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-99h7q_43862b0c-fb60-45f2-b4bd-0e09864292a9/registry-server/0.log" Feb 28 10:01:25 crc kubenswrapper[4687]: I0228 10:01:25.597651 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7_993f721e-f5f5-4e7e-9896-5931bd6e0023/pull/0.log" Feb 28 10:01:25 crc kubenswrapper[4687]: I0228 10:01:25.599452 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7_993f721e-f5f5-4e7e-9896-5931bd6e0023/util/0.log" Feb 28 10:01:25 crc kubenswrapper[4687]: I0228 10:01:25.660761 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kgxk7_993f721e-f5f5-4e7e-9896-5931bd6e0023/extract/0.log" Feb 28 10:01:25 crc kubenswrapper[4687]: I0228 10:01:25.760106 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qhc57_e9586004-7da3-41d4-980d-825eafe37f51/marketplace-operator/0.log" Feb 28 10:01:25 crc kubenswrapper[4687]: I0228 10:01:25.871974 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dcllh_1c55d393-9095-4638-b5d0-d6dd60859eb8/extract-utilities/0.log" Feb 28 10:01:26 crc kubenswrapper[4687]: I0228 10:01:26.004678 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dcllh_1c55d393-9095-4638-b5d0-d6dd60859eb8/extract-content/0.log" Feb 28 10:01:26 crc kubenswrapper[4687]: I0228 10:01:26.010544 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dcllh_1c55d393-9095-4638-b5d0-d6dd60859eb8/extract-utilities/0.log" Feb 28 10:01:26 crc kubenswrapper[4687]: I0228 10:01:26.022280 4687 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-dcllh_1c55d393-9095-4638-b5d0-d6dd60859eb8/extract-content/0.log" Feb 28 10:01:26 crc kubenswrapper[4687]: I0228 10:01:26.183536 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dcllh_1c55d393-9095-4638-b5d0-d6dd60859eb8/extract-utilities/0.log" Feb 28 10:01:26 crc kubenswrapper[4687]: I0228 10:01:26.209681 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dcllh_1c55d393-9095-4638-b5d0-d6dd60859eb8/extract-content/0.log" Feb 28 10:01:26 crc kubenswrapper[4687]: I0228 10:01:26.319057 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dcllh_1c55d393-9095-4638-b5d0-d6dd60859eb8/registry-server/0.log" Feb 28 10:01:26 crc kubenswrapper[4687]: I0228 10:01:26.391304 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xssb6_12de48e8-809e-43e9-827f-28ce52d796e8/extract-utilities/0.log" Feb 28 10:01:26 crc kubenswrapper[4687]: I0228 10:01:26.543786 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xssb6_12de48e8-809e-43e9-827f-28ce52d796e8/extract-content/0.log" Feb 28 10:01:26 crc kubenswrapper[4687]: I0228 10:01:26.568397 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xssb6_12de48e8-809e-43e9-827f-28ce52d796e8/extract-utilities/0.log" Feb 28 10:01:26 crc kubenswrapper[4687]: I0228 10:01:26.605071 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xssb6_12de48e8-809e-43e9-827f-28ce52d796e8/extract-content/0.log" Feb 28 10:01:26 crc kubenswrapper[4687]: I0228 10:01:26.722368 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xssb6_12de48e8-809e-43e9-827f-28ce52d796e8/extract-utilities/0.log" 
Feb 28 10:01:26 crc kubenswrapper[4687]: I0228 10:01:26.770298 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xssb6_12de48e8-809e-43e9-827f-28ce52d796e8/extract-content/0.log" Feb 28 10:01:27 crc kubenswrapper[4687]: I0228 10:01:27.263074 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xssb6_12de48e8-809e-43e9-827f-28ce52d796e8/registry-server/0.log" Feb 28 10:02:00 crc kubenswrapper[4687]: I0228 10:02:00.148237 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537882-p72cj"] Feb 28 10:02:00 crc kubenswrapper[4687]: E0228 10:02:00.149354 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf28a688-f2d4-4283-8828-f239a03d9029" containerName="keystone-cron" Feb 28 10:02:00 crc kubenswrapper[4687]: I0228 10:02:00.149371 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf28a688-f2d4-4283-8828-f239a03d9029" containerName="keystone-cron" Feb 28 10:02:00 crc kubenswrapper[4687]: I0228 10:02:00.149586 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf28a688-f2d4-4283-8828-f239a03d9029" containerName="keystone-cron" Feb 28 10:02:00 crc kubenswrapper[4687]: I0228 10:02:00.150394 4687 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537882-p72cj" Feb 28 10:02:00 crc kubenswrapper[4687]: I0228 10:02:00.152186 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 10:02:00 crc kubenswrapper[4687]: I0228 10:02:00.152724 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 10:02:00 crc kubenswrapper[4687]: I0228 10:02:00.153009 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 10:02:00 crc kubenswrapper[4687]: I0228 10:02:00.158303 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537882-p72cj"] Feb 28 10:02:00 crc kubenswrapper[4687]: I0228 10:02:00.177084 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ljct\" (UniqueName: \"kubernetes.io/projected/2eaa295e-8638-44ef-907d-ac9a0f19791b-kube-api-access-5ljct\") pod \"auto-csr-approver-29537882-p72cj\" (UID: \"2eaa295e-8638-44ef-907d-ac9a0f19791b\") " pod="openshift-infra/auto-csr-approver-29537882-p72cj" Feb 28 10:02:00 crc kubenswrapper[4687]: I0228 10:02:00.278213 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ljct\" (UniqueName: \"kubernetes.io/projected/2eaa295e-8638-44ef-907d-ac9a0f19791b-kube-api-access-5ljct\") pod \"auto-csr-approver-29537882-p72cj\" (UID: \"2eaa295e-8638-44ef-907d-ac9a0f19791b\") " pod="openshift-infra/auto-csr-approver-29537882-p72cj" Feb 28 10:02:00 crc kubenswrapper[4687]: I0228 10:02:00.294858 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ljct\" (UniqueName: \"kubernetes.io/projected/2eaa295e-8638-44ef-907d-ac9a0f19791b-kube-api-access-5ljct\") pod \"auto-csr-approver-29537882-p72cj\" (UID: \"2eaa295e-8638-44ef-907d-ac9a0f19791b\") " 
pod="openshift-infra/auto-csr-approver-29537882-p72cj" Feb 28 10:02:00 crc kubenswrapper[4687]: I0228 10:02:00.468844 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537882-p72cj" Feb 28 10:02:01 crc kubenswrapper[4687]: I0228 10:02:01.017674 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537882-p72cj"] Feb 28 10:02:01 crc kubenswrapper[4687]: I0228 10:02:01.021941 4687 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 28 10:02:02 crc kubenswrapper[4687]: I0228 10:02:02.003700 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537882-p72cj" event={"ID":"2eaa295e-8638-44ef-907d-ac9a0f19791b","Type":"ContainerStarted","Data":"c305155c4e31965f9e394f49cca59d0b427c205e6806bef9173a1b7daaa0db60"} Feb 28 10:02:03 crc kubenswrapper[4687]: I0228 10:02:03.016540 4687 generic.go:334] "Generic (PLEG): container finished" podID="2eaa295e-8638-44ef-907d-ac9a0f19791b" containerID="26d4621e9404797d7b5bd398af4c294b630aa30fe141f0064ab26806d7b1532f" exitCode=0 Feb 28 10:02:03 crc kubenswrapper[4687]: I0228 10:02:03.016654 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537882-p72cj" event={"ID":"2eaa295e-8638-44ef-907d-ac9a0f19791b","Type":"ContainerDied","Data":"26d4621e9404797d7b5bd398af4c294b630aa30fe141f0064ab26806d7b1532f"} Feb 28 10:02:04 crc kubenswrapper[4687]: I0228 10:02:04.332802 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537882-p72cj" Feb 28 10:02:04 crc kubenswrapper[4687]: I0228 10:02:04.463236 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ljct\" (UniqueName: \"kubernetes.io/projected/2eaa295e-8638-44ef-907d-ac9a0f19791b-kube-api-access-5ljct\") pod \"2eaa295e-8638-44ef-907d-ac9a0f19791b\" (UID: \"2eaa295e-8638-44ef-907d-ac9a0f19791b\") " Feb 28 10:02:04 crc kubenswrapper[4687]: I0228 10:02:04.470243 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eaa295e-8638-44ef-907d-ac9a0f19791b-kube-api-access-5ljct" (OuterVolumeSpecName: "kube-api-access-5ljct") pod "2eaa295e-8638-44ef-907d-ac9a0f19791b" (UID: "2eaa295e-8638-44ef-907d-ac9a0f19791b"). InnerVolumeSpecName "kube-api-access-5ljct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 10:02:04 crc kubenswrapper[4687]: I0228 10:02:04.564931 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ljct\" (UniqueName: \"kubernetes.io/projected/2eaa295e-8638-44ef-907d-ac9a0f19791b-kube-api-access-5ljct\") on node \"crc\" DevicePath \"\"" Feb 28 10:02:05 crc kubenswrapper[4687]: I0228 10:02:05.040314 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537882-p72cj" event={"ID":"2eaa295e-8638-44ef-907d-ac9a0f19791b","Type":"ContainerDied","Data":"c305155c4e31965f9e394f49cca59d0b427c205e6806bef9173a1b7daaa0db60"} Feb 28 10:02:05 crc kubenswrapper[4687]: I0228 10:02:05.040373 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c305155c4e31965f9e394f49cca59d0b427c205e6806bef9173a1b7daaa0db60" Feb 28 10:02:05 crc kubenswrapper[4687]: I0228 10:02:05.040387 4687 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537882-p72cj" Feb 28 10:02:05 crc kubenswrapper[4687]: I0228 10:02:05.403865 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537876-mpsbc"] Feb 28 10:02:05 crc kubenswrapper[4687]: I0228 10:02:05.411604 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537876-mpsbc"] Feb 28 10:02:06 crc kubenswrapper[4687]: I0228 10:02:06.671133 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99a97026-1a84-4969-8518-3e7ac150c55b" path="/var/lib/kubelet/pods/99a97026-1a84-4969-8518-3e7ac150c55b/volumes" Feb 28 10:02:25 crc kubenswrapper[4687]: I0228 10:02:25.002601 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 10:02:25 crc kubenswrapper[4687]: I0228 10:02:25.003267 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 10:02:44 crc kubenswrapper[4687]: I0228 10:02:44.668568 4687 scope.go:117] "RemoveContainer" containerID="09c21c93a9844a61e643e3d0510ee2dd7a178f7747aa1bcbba90038dbb011c20" Feb 28 10:02:51 crc kubenswrapper[4687]: I0228 10:02:51.585082 4687 generic.go:334] "Generic (PLEG): container finished" podID="1640ed83-395f-4d74-85f2-846f87f43da0" containerID="a73b3dc6f9bbd290dd9f20e246e5a7131d2bda1e2ca283aff89653f5a75b5af4" exitCode=0 Feb 28 10:02:51 crc kubenswrapper[4687]: I0228 10:02:51.585187 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-j45kw/must-gather-jqt65" event={"ID":"1640ed83-395f-4d74-85f2-846f87f43da0","Type":"ContainerDied","Data":"a73b3dc6f9bbd290dd9f20e246e5a7131d2bda1e2ca283aff89653f5a75b5af4"} Feb 28 10:02:51 crc kubenswrapper[4687]: I0228 10:02:51.586342 4687 scope.go:117] "RemoveContainer" containerID="a73b3dc6f9bbd290dd9f20e246e5a7131d2bda1e2ca283aff89653f5a75b5af4" Feb 28 10:02:52 crc kubenswrapper[4687]: I0228 10:02:52.633136 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-j45kw_must-gather-jqt65_1640ed83-395f-4d74-85f2-846f87f43da0/gather/0.log" Feb 28 10:02:55 crc kubenswrapper[4687]: I0228 10:02:55.002083 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 10:02:55 crc kubenswrapper[4687]: I0228 10:02:55.003837 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 10:03:03 crc kubenswrapper[4687]: I0228 10:03:03.090348 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-j45kw/must-gather-jqt65"] Feb 28 10:03:03 crc kubenswrapper[4687]: I0228 10:03:03.091195 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-j45kw/must-gather-jqt65" podUID="1640ed83-395f-4d74-85f2-846f87f43da0" containerName="copy" containerID="cri-o://ad08b4ec977f616161821cf226b32c184869495faa546ae6d58ad8c2762ed00f" gracePeriod=2 Feb 28 10:03:03 crc kubenswrapper[4687]: I0228 10:03:03.097715 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-j45kw/must-gather-jqt65"] Feb 28 10:03:03 crc kubenswrapper[4687]: I0228 10:03:03.514115 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-j45kw_must-gather-jqt65_1640ed83-395f-4d74-85f2-846f87f43da0/copy/0.log" Feb 28 10:03:03 crc kubenswrapper[4687]: I0228 10:03:03.514911 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j45kw/must-gather-jqt65" Feb 28 10:03:03 crc kubenswrapper[4687]: I0228 10:03:03.601174 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1640ed83-395f-4d74-85f2-846f87f43da0-must-gather-output\") pod \"1640ed83-395f-4d74-85f2-846f87f43da0\" (UID: \"1640ed83-395f-4d74-85f2-846f87f43da0\") " Feb 28 10:03:03 crc kubenswrapper[4687]: I0228 10:03:03.601285 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvgcv\" (UniqueName: \"kubernetes.io/projected/1640ed83-395f-4d74-85f2-846f87f43da0-kube-api-access-gvgcv\") pod \"1640ed83-395f-4d74-85f2-846f87f43da0\" (UID: \"1640ed83-395f-4d74-85f2-846f87f43da0\") " Feb 28 10:03:03 crc kubenswrapper[4687]: I0228 10:03:03.611174 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1640ed83-395f-4d74-85f2-846f87f43da0-kube-api-access-gvgcv" (OuterVolumeSpecName: "kube-api-access-gvgcv") pod "1640ed83-395f-4d74-85f2-846f87f43da0" (UID: "1640ed83-395f-4d74-85f2-846f87f43da0"). InnerVolumeSpecName "kube-api-access-gvgcv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 10:03:03 crc kubenswrapper[4687]: I0228 10:03:03.704317 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvgcv\" (UniqueName: \"kubernetes.io/projected/1640ed83-395f-4d74-85f2-846f87f43da0-kube-api-access-gvgcv\") on node \"crc\" DevicePath \"\"" Feb 28 10:03:03 crc kubenswrapper[4687]: I0228 10:03:03.715495 4687 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-j45kw_must-gather-jqt65_1640ed83-395f-4d74-85f2-846f87f43da0/copy/0.log" Feb 28 10:03:03 crc kubenswrapper[4687]: I0228 10:03:03.716074 4687 generic.go:334] "Generic (PLEG): container finished" podID="1640ed83-395f-4d74-85f2-846f87f43da0" containerID="ad08b4ec977f616161821cf226b32c184869495faa546ae6d58ad8c2762ed00f" exitCode=143 Feb 28 10:03:03 crc kubenswrapper[4687]: I0228 10:03:03.716125 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j45kw/must-gather-jqt65" Feb 28 10:03:03 crc kubenswrapper[4687]: I0228 10:03:03.716136 4687 scope.go:117] "RemoveContainer" containerID="ad08b4ec977f616161821cf226b32c184869495faa546ae6d58ad8c2762ed00f" Feb 28 10:03:03 crc kubenswrapper[4687]: I0228 10:03:03.738133 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1640ed83-395f-4d74-85f2-846f87f43da0-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "1640ed83-395f-4d74-85f2-846f87f43da0" (UID: "1640ed83-395f-4d74-85f2-846f87f43da0"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 10:03:03 crc kubenswrapper[4687]: I0228 10:03:03.742232 4687 scope.go:117] "RemoveContainer" containerID="a73b3dc6f9bbd290dd9f20e246e5a7131d2bda1e2ca283aff89653f5a75b5af4" Feb 28 10:03:03 crc kubenswrapper[4687]: I0228 10:03:03.792051 4687 scope.go:117] "RemoveContainer" containerID="ad08b4ec977f616161821cf226b32c184869495faa546ae6d58ad8c2762ed00f" Feb 28 10:03:03 crc kubenswrapper[4687]: E0228 10:03:03.792577 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad08b4ec977f616161821cf226b32c184869495faa546ae6d58ad8c2762ed00f\": container with ID starting with ad08b4ec977f616161821cf226b32c184869495faa546ae6d58ad8c2762ed00f not found: ID does not exist" containerID="ad08b4ec977f616161821cf226b32c184869495faa546ae6d58ad8c2762ed00f" Feb 28 10:03:03 crc kubenswrapper[4687]: I0228 10:03:03.792611 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad08b4ec977f616161821cf226b32c184869495faa546ae6d58ad8c2762ed00f"} err="failed to get container status \"ad08b4ec977f616161821cf226b32c184869495faa546ae6d58ad8c2762ed00f\": rpc error: code = NotFound desc = could not find container \"ad08b4ec977f616161821cf226b32c184869495faa546ae6d58ad8c2762ed00f\": container with ID starting with ad08b4ec977f616161821cf226b32c184869495faa546ae6d58ad8c2762ed00f not found: ID does not exist" Feb 28 10:03:03 crc kubenswrapper[4687]: I0228 10:03:03.792635 4687 scope.go:117] "RemoveContainer" containerID="a73b3dc6f9bbd290dd9f20e246e5a7131d2bda1e2ca283aff89653f5a75b5af4" Feb 28 10:03:03 crc kubenswrapper[4687]: E0228 10:03:03.792894 4687 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a73b3dc6f9bbd290dd9f20e246e5a7131d2bda1e2ca283aff89653f5a75b5af4\": container with ID starting with 
a73b3dc6f9bbd290dd9f20e246e5a7131d2bda1e2ca283aff89653f5a75b5af4 not found: ID does not exist" containerID="a73b3dc6f9bbd290dd9f20e246e5a7131d2bda1e2ca283aff89653f5a75b5af4" Feb 28 10:03:03 crc kubenswrapper[4687]: I0228 10:03:03.792916 4687 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a73b3dc6f9bbd290dd9f20e246e5a7131d2bda1e2ca283aff89653f5a75b5af4"} err="failed to get container status \"a73b3dc6f9bbd290dd9f20e246e5a7131d2bda1e2ca283aff89653f5a75b5af4\": rpc error: code = NotFound desc = could not find container \"a73b3dc6f9bbd290dd9f20e246e5a7131d2bda1e2ca283aff89653f5a75b5af4\": container with ID starting with a73b3dc6f9bbd290dd9f20e246e5a7131d2bda1e2ca283aff89653f5a75b5af4 not found: ID does not exist" Feb 28 10:03:03 crc kubenswrapper[4687]: I0228 10:03:03.808081 4687 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1640ed83-395f-4d74-85f2-846f87f43da0-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 28 10:03:04 crc kubenswrapper[4687]: I0228 10:03:04.668514 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1640ed83-395f-4d74-85f2-846f87f43da0" path="/var/lib/kubelet/pods/1640ed83-395f-4d74-85f2-846f87f43da0/volumes" Feb 28 10:03:25 crc kubenswrapper[4687]: I0228 10:03:25.002324 4687 patch_prober.go:28] interesting pod/machine-config-daemon-sbkqn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 10:03:25 crc kubenswrapper[4687]: I0228 10:03:25.003124 4687 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 28 10:03:25 crc kubenswrapper[4687]: I0228 10:03:25.003211 4687 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" Feb 28 10:03:25 crc kubenswrapper[4687]: I0228 10:03:25.004180 4687 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5d9028a98f26994b531f99d84668faf12d778306d75d6630b5778ce546d19200"} pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 28 10:03:25 crc kubenswrapper[4687]: I0228 10:03:25.004258 4687 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerName="machine-config-daemon" containerID="cri-o://5d9028a98f26994b531f99d84668faf12d778306d75d6630b5778ce546d19200" gracePeriod=600 Feb 28 10:03:25 crc kubenswrapper[4687]: E0228 10:03:25.124978 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 10:03:25 crc kubenswrapper[4687]: I0228 10:03:25.930528 4687 generic.go:334] "Generic (PLEG): container finished" podID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" containerID="5d9028a98f26994b531f99d84668faf12d778306d75d6630b5778ce546d19200" exitCode=0 Feb 28 10:03:25 crc kubenswrapper[4687]: I0228 10:03:25.930590 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" event={"ID":"dcd48dfa-192a-4a5b-be30-fc7eebc90da1","Type":"ContainerDied","Data":"5d9028a98f26994b531f99d84668faf12d778306d75d6630b5778ce546d19200"} Feb 28 10:03:25 crc kubenswrapper[4687]: I0228 10:03:25.932584 4687 scope.go:117] "RemoveContainer" containerID="7aa4c93cc379009cd173d6be3669f0744c441bb2f0f3fe73758c25336f7de5a1" Feb 28 10:03:25 crc kubenswrapper[4687]: I0228 10:03:25.933352 4687 scope.go:117] "RemoveContainer" containerID="5d9028a98f26994b531f99d84668faf12d778306d75d6630b5778ce546d19200" Feb 28 10:03:25 crc kubenswrapper[4687]: E0228 10:03:25.933834 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 10:03:40 crc kubenswrapper[4687]: I0228 10:03:40.658011 4687 scope.go:117] "RemoveContainer" containerID="5d9028a98f26994b531f99d84668faf12d778306d75d6630b5778ce546d19200" Feb 28 10:03:40 crc kubenswrapper[4687]: E0228 10:03:40.659346 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 10:03:55 crc kubenswrapper[4687]: I0228 10:03:55.656975 4687 scope.go:117] "RemoveContainer" containerID="5d9028a98f26994b531f99d84668faf12d778306d75d6630b5778ce546d19200" Feb 28 10:03:55 crc kubenswrapper[4687]: E0228 10:03:55.658183 4687 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 10:04:00 crc kubenswrapper[4687]: I0228 10:04:00.143650 4687 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537884-trxcf"] Feb 28 10:04:00 crc kubenswrapper[4687]: E0228 10:04:00.146315 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eaa295e-8638-44ef-907d-ac9a0f19791b" containerName="oc" Feb 28 10:04:00 crc kubenswrapper[4687]: I0228 10:04:00.146435 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eaa295e-8638-44ef-907d-ac9a0f19791b" containerName="oc" Feb 28 10:04:00 crc kubenswrapper[4687]: E0228 10:04:00.146525 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1640ed83-395f-4d74-85f2-846f87f43da0" containerName="copy" Feb 28 10:04:00 crc kubenswrapper[4687]: I0228 10:04:00.146580 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="1640ed83-395f-4d74-85f2-846f87f43da0" containerName="copy" Feb 28 10:04:00 crc kubenswrapper[4687]: E0228 10:04:00.146651 4687 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1640ed83-395f-4d74-85f2-846f87f43da0" containerName="gather" Feb 28 10:04:00 crc kubenswrapper[4687]: I0228 10:04:00.146706 4687 state_mem.go:107] "Deleted CPUSet assignment" podUID="1640ed83-395f-4d74-85f2-846f87f43da0" containerName="gather" Feb 28 10:04:00 crc kubenswrapper[4687]: I0228 10:04:00.147109 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="1640ed83-395f-4d74-85f2-846f87f43da0" containerName="copy" Feb 28 10:04:00 crc kubenswrapper[4687]: I0228 10:04:00.147198 4687 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2eaa295e-8638-44ef-907d-ac9a0f19791b" containerName="oc" Feb 28 10:04:00 crc kubenswrapper[4687]: I0228 10:04:00.147269 4687 memory_manager.go:354] "RemoveStaleState removing state" podUID="1640ed83-395f-4d74-85f2-846f87f43da0" containerName="gather" Feb 28 10:04:00 crc kubenswrapper[4687]: I0228 10:04:00.148408 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537884-trxcf" Feb 28 10:04:00 crc kubenswrapper[4687]: I0228 10:04:00.152910 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 10:04:00 crc kubenswrapper[4687]: I0228 10:04:00.152920 4687 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 10:04:00 crc kubenswrapper[4687]: I0228 10:04:00.153316 4687 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fl562" Feb 28 10:04:00 crc kubenswrapper[4687]: I0228 10:04:00.154761 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537884-trxcf"] Feb 28 10:04:00 crc kubenswrapper[4687]: I0228 10:04:00.241375 4687 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj266\" (UniqueName: \"kubernetes.io/projected/f412abd5-c9ec-47d3-ad79-c34a4b6550ac-kube-api-access-sj266\") pod \"auto-csr-approver-29537884-trxcf\" (UID: \"f412abd5-c9ec-47d3-ad79-c34a4b6550ac\") " pod="openshift-infra/auto-csr-approver-29537884-trxcf" Feb 28 10:04:00 crc kubenswrapper[4687]: I0228 10:04:00.344340 4687 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj266\" (UniqueName: \"kubernetes.io/projected/f412abd5-c9ec-47d3-ad79-c34a4b6550ac-kube-api-access-sj266\") pod \"auto-csr-approver-29537884-trxcf\" (UID: 
\"f412abd5-c9ec-47d3-ad79-c34a4b6550ac\") " pod="openshift-infra/auto-csr-approver-29537884-trxcf" Feb 28 10:04:00 crc kubenswrapper[4687]: I0228 10:04:00.362925 4687 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj266\" (UniqueName: \"kubernetes.io/projected/f412abd5-c9ec-47d3-ad79-c34a4b6550ac-kube-api-access-sj266\") pod \"auto-csr-approver-29537884-trxcf\" (UID: \"f412abd5-c9ec-47d3-ad79-c34a4b6550ac\") " pod="openshift-infra/auto-csr-approver-29537884-trxcf" Feb 28 10:04:00 crc kubenswrapper[4687]: I0228 10:04:00.468780 4687 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537884-trxcf" Feb 28 10:04:00 crc kubenswrapper[4687]: I0228 10:04:00.880655 4687 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537884-trxcf"] Feb 28 10:04:01 crc kubenswrapper[4687]: I0228 10:04:01.260522 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537884-trxcf" event={"ID":"f412abd5-c9ec-47d3-ad79-c34a4b6550ac","Type":"ContainerStarted","Data":"af12137f3f059e446ea8a7210088dd290559c831c89db51e89de54fe80f4e164"} Feb 28 10:04:02 crc kubenswrapper[4687]: I0228 10:04:02.270673 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537884-trxcf" event={"ID":"f412abd5-c9ec-47d3-ad79-c34a4b6550ac","Type":"ContainerStarted","Data":"4a468c3fa8facf36924ddd839cca2c3b7783b4f4253ca6d0721820df97eee15a"} Feb 28 10:04:03 crc kubenswrapper[4687]: I0228 10:04:03.282910 4687 generic.go:334] "Generic (PLEG): container finished" podID="f412abd5-c9ec-47d3-ad79-c34a4b6550ac" containerID="4a468c3fa8facf36924ddd839cca2c3b7783b4f4253ca6d0721820df97eee15a" exitCode=0 Feb 28 10:04:03 crc kubenswrapper[4687]: I0228 10:04:03.283040 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537884-trxcf" 
event={"ID":"f412abd5-c9ec-47d3-ad79-c34a4b6550ac","Type":"ContainerDied","Data":"4a468c3fa8facf36924ddd839cca2c3b7783b4f4253ca6d0721820df97eee15a"} Feb 28 10:04:03 crc kubenswrapper[4687]: I0228 10:04:03.544710 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537884-trxcf" Feb 28 10:04:03 crc kubenswrapper[4687]: I0228 10:04:03.725941 4687 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj266\" (UniqueName: \"kubernetes.io/projected/f412abd5-c9ec-47d3-ad79-c34a4b6550ac-kube-api-access-sj266\") pod \"f412abd5-c9ec-47d3-ad79-c34a4b6550ac\" (UID: \"f412abd5-c9ec-47d3-ad79-c34a4b6550ac\") " Feb 28 10:04:03 crc kubenswrapper[4687]: I0228 10:04:03.745198 4687 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f412abd5-c9ec-47d3-ad79-c34a4b6550ac-kube-api-access-sj266" (OuterVolumeSpecName: "kube-api-access-sj266") pod "f412abd5-c9ec-47d3-ad79-c34a4b6550ac" (UID: "f412abd5-c9ec-47d3-ad79-c34a4b6550ac"). InnerVolumeSpecName "kube-api-access-sj266". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 10:04:03 crc kubenswrapper[4687]: I0228 10:04:03.829082 4687 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj266\" (UniqueName: \"kubernetes.io/projected/f412abd5-c9ec-47d3-ad79-c34a4b6550ac-kube-api-access-sj266\") on node \"crc\" DevicePath \"\"" Feb 28 10:04:04 crc kubenswrapper[4687]: I0228 10:04:04.296160 4687 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537884-trxcf" event={"ID":"f412abd5-c9ec-47d3-ad79-c34a4b6550ac","Type":"ContainerDied","Data":"af12137f3f059e446ea8a7210088dd290559c831c89db51e89de54fe80f4e164"} Feb 28 10:04:04 crc kubenswrapper[4687]: I0228 10:04:04.296210 4687 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af12137f3f059e446ea8a7210088dd290559c831c89db51e89de54fe80f4e164" Feb 28 10:04:04 crc kubenswrapper[4687]: I0228 10:04:04.296285 4687 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537884-trxcf" Feb 28 10:04:04 crc kubenswrapper[4687]: I0228 10:04:04.606657 4687 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537878-rxfwt"] Feb 28 10:04:04 crc kubenswrapper[4687]: I0228 10:04:04.612478 4687 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537878-rxfwt"] Feb 28 10:04:04 crc kubenswrapper[4687]: I0228 10:04:04.665633 4687 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="812e6bb9-47ab-4e7a-9376-6337b4968de0" path="/var/lib/kubelet/pods/812e6bb9-47ab-4e7a-9376-6337b4968de0/volumes" Feb 28 10:04:09 crc kubenswrapper[4687]: I0228 10:04:09.657010 4687 scope.go:117] "RemoveContainer" containerID="5d9028a98f26994b531f99d84668faf12d778306d75d6630b5778ce546d19200" Feb 28 10:04:09 crc kubenswrapper[4687]: E0228 10:04:09.658410 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 10:04:22 crc kubenswrapper[4687]: I0228 10:04:22.657279 4687 scope.go:117] "RemoveContainer" containerID="5d9028a98f26994b531f99d84668faf12d778306d75d6630b5778ce546d19200" Feb 28 10:04:22 crc kubenswrapper[4687]: E0228 10:04:22.659277 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 10:04:35 crc kubenswrapper[4687]: I0228 10:04:35.657413 4687 scope.go:117] "RemoveContainer" containerID="5d9028a98f26994b531f99d84668faf12d778306d75d6630b5778ce546d19200" Feb 28 10:04:35 crc kubenswrapper[4687]: E0228 10:04:35.658521 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 10:04:44 crc kubenswrapper[4687]: I0228 10:04:44.785935 4687 scope.go:117] "RemoveContainer" containerID="6a188535abca0d0acd989b48e413b5a8abcba88fc2373fb5bc78084099db5c25" Feb 28 10:04:44 crc kubenswrapper[4687]: I0228 10:04:44.809700 4687 scope.go:117] "RemoveContainer" 
containerID="7402f77759435d60461a001d49fe8652ae8b0b7ac26f0a4ac79d68967420a4d5" Feb 28 10:04:48 crc kubenswrapper[4687]: I0228 10:04:48.662569 4687 scope.go:117] "RemoveContainer" containerID="5d9028a98f26994b531f99d84668faf12d778306d75d6630b5778ce546d19200" Feb 28 10:04:48 crc kubenswrapper[4687]: E0228 10:04:48.663506 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 10:05:02 crc kubenswrapper[4687]: I0228 10:05:02.657989 4687 scope.go:117] "RemoveContainer" containerID="5d9028a98f26994b531f99d84668faf12d778306d75d6630b5778ce546d19200" Feb 28 10:05:02 crc kubenswrapper[4687]: E0228 10:05:02.659398 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 10:05:14 crc kubenswrapper[4687]: I0228 10:05:14.657992 4687 scope.go:117] "RemoveContainer" containerID="5d9028a98f26994b531f99d84668faf12d778306d75d6630b5778ce546d19200" Feb 28 10:05:14 crc kubenswrapper[4687]: E0228 10:05:14.658992 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 10:05:25 crc kubenswrapper[4687]: I0228 10:05:25.657225 4687 scope.go:117] "RemoveContainer" containerID="5d9028a98f26994b531f99d84668faf12d778306d75d6630b5778ce546d19200" Feb 28 10:05:25 crc kubenswrapper[4687]: E0228 10:05:25.658129 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1" Feb 28 10:05:37 crc kubenswrapper[4687]: I0228 10:05:37.658120 4687 scope.go:117] "RemoveContainer" containerID="5d9028a98f26994b531f99d84668faf12d778306d75d6630b5778ce546d19200" Feb 28 10:05:37 crc kubenswrapper[4687]: E0228 10:05:37.659243 4687 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sbkqn_openshift-machine-config-operator(dcd48dfa-192a-4a5b-be30-fc7eebc90da1)\"" pod="openshift-machine-config-operator/machine-config-daemon-sbkqn" podUID="dcd48dfa-192a-4a5b-be30-fc7eebc90da1"